diff --git a/README.md b/README.md
index 7289e58dbea773fdd2c2a05db800e77946e2236e..8e9b07b6a4a4d9b4382f0c116095741ba1edfb5f 100644
--- a/README.md
+++ b/README.md
@@ -1,93 +1,40 @@
-# vlkb-soda
+# vlkb-soda
 
+vlkb-soda is a web application for accessing astronomical data stored in FITS files. It consists of:
 
+- **vlkb-soda**, a web application with its associated **vlkbd** daemon, which serves FITS file contents based on VO SODA
+- **vlkb**, a command line utility that provides some of the web application's functionality on the command line
 
-## Getting started
+It can be combined with vlkb-search to provide a complete data discovery and access service:
+- **vlkb-search**, a web application with its associated **vlkb-obscore** command line tool, based on the VO ObsCore table
 
-To make it easy for you to get started with GitLab, here's a list of recommended next steps.
+The vlkb-search web application is available from https://ict.inaf.it/gitlab/ViaLactea/vlkb-search .
 
-Already a pro? Just edit this README.md and make it your own. Want to make it easy? [Use the template at the bottom](#editing-this-readme)!
+A Dockerized version is available from this project's registry: git.ia2.inaf.it:5050/vialactea/vlkb-soda.
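+
+Once deployed, cutouts are requested with the standard SODA parameters (ID, CIRCLE, POS, BAND, ...). A minimal sketch, assuming a deployment at example.org under /vlkb-soda/soda and a hypothetical dataset identifier:
+
+```bash
+curl -o cutout.fits -G "https://example.org/vlkb-soda/soda" \
+     --data-urlencode "ID=ivo://auth.example.org/datasets/fits?cubes/some_cube.fits" \
+     --data-urlencode "CIRCLE=201.4 -43.0 0.1"
+```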
 
-## Add your files
-
-- [ ] [Create](https://docs.gitlab.com/ee/user/project/repository/web_editor.html#create-a-file) or [upload](https://docs.gitlab.com/ee/user/project/repository/web_editor.html#upload-a-file) files
-- [ ] [Add files using the command line](https://docs.gitlab.com/ee/gitlab-basics/add-file.html#add-a-file-using-the-command-line) or push an existing Git repository with the following command:
-
-```
-cd existing_repo
-git remote add origin https://www.ict.inaf.it/gitlab/ViaLactea/vlkb-soda.git
-git branch -M main
-git push -uf origin main
-```
-
-## Integrate with your tools
-
-- [ ] [Set up project integrations](https://www.ict.inaf.it/gitlab/ViaLactea/vlkb-soda/-/settings/integrations)
-
-## Collaborate with your team
-
-- [ ] [Invite team members and collaborators](https://docs.gitlab.com/ee/user/project/members/)
-- [ ] [Create a new merge request](https://docs.gitlab.com/ee/user/project/merge_requests/creating_merge_requests.html)
-- [ ] [Automatically close issues from merge requests](https://docs.gitlab.com/ee/user/project/issues/managing_issues.html#closing-issues-automatically)
-- [ ] [Enable merge request approvals](https://docs.gitlab.com/ee/user/project/merge_requests/approvals/)
-- [ ] [Set auto-merge](https://docs.gitlab.com/ee/user/project/merge_requests/merge_when_pipeline_succeeds.html)
-
-## Test and Deploy
-
-Use the built-in continuous integration in GitLab.
-
-- [ ] [Get started with GitLab CI/CD](https://docs.gitlab.com/ee/ci/quick_start/index.html)
-- [ ] [Analyze your code for known vulnerabilities with Static Application Security Testing (SAST)](https://docs.gitlab.com/ee/user/application_security/sast/)
-- [ ] [Deploy to Kubernetes, Amazon EC2, or Amazon ECS using Auto Deploy](https://docs.gitlab.com/ee/topics/autodevops/requirements.html)
-- [ ] [Use pull-based deployments for improved Kubernetes management](https://docs.gitlab.com/ee/user/clusters/agent/)
-- [ ] [Set up protected environments](https://docs.gitlab.com/ee/ci/environments/protected_environments.html)
-
-***
-
-# Editing this README
-
-When you're ready to make this README your own, just edit this file and use the handy template below (or feel free to structure it however you want - this is just a starting point!). Thanks to [makeareadme.com](https://www.makeareadme.com/) for this template.
-
-## Suggestions for a good README
-
-Every project is different, so consider which of these sections apply to yours. The sections used in the template are suggestions for most open source projects. Also keep in mind that while a README can be too long and detailed, too long is better than too short. If you think your README is too long, consider utilizing another form of documentation rather than cutting out information.
-
-## Name
-Choose a self-explaining name for your project.
+## Installation
 
-## Description
-Let people know what your project can do specifically. Provide context and add a link to any reference visitors might be unfamiliar with. A list of Features or a Background subsection can also be added here. If there are alternatives to your project, this is a good place to list differentiating factors.
+There are rpm, deb and war packages available for Debian, CentOS and Fedora.
 
-## Badges
-On some READMEs, you may see small images that convey metadata, such as whether or not all the tests are passing for the project. You can use Shields to add some to your README. Many services also have instructions for adding a badge.
+### Install from packages (rpm/deb and war)
 
-## Visuals
-Depending on what you are making, it can be a good idea to include screenshots or even a video (you'll frequently see GIFs rather than actual videos). Tools like ttygif can help, but check out Asciinema for a more sophisticated method.
+There is a war package for the cutout web application:
 
-## Installation
-Within a particular ecosystem, there may be a common way of installing things, such as using Yarn, NuGet, or Homebrew. However, consider the possibility that whoever is reading your README is a novice and would like more guidance. Listing specific steps helps remove ambiguity and gets people to using your project as quickly as possible. If it only runs in a specific context like a particular programming language version or operating system or has dependencies that have to be installed manually, also add a Requirements subsection.
+- vlkb-soda-X.Y.Z.war
 
-## Usage
-Use examples liberally, and show the expected output if you can. It's helpful to have inline the smallest example of usage that you can demonstrate, while providing links to more sophisticated examples if they are too long to reasonably include in the README.
+There are also two packages with Linux executables (deb or rpm):
 
-## Support
-Tell people where they can go to for help. It can be any combination of an issue tracker, a chat room, an email address, etc.
+- vlkbd-X.Y.Z.deb implements the cutout engine and should be installed together with vlkb-cutout-\*.war
+- vlkb-X.Y.Z.deb is an optional utility
 
-## Roadmap
-If you have ideas for releases in the future, it is a good idea to list them in the README.
+Additionally, there is an optional vlkb-obscore utility for the vlkb-search web application:
+- vlkb-obscore-X.Y.Z.deb creates the ObsCore table for vlkb-search-\*.war
 
-## Contributing
-State if you are open to contributions and what your requirements are for accepting them.
+To download version X.Y, substitute one of the above package names into:
 
-For people who want to make changes to your project, it's helpful to have some documentation on how to get started. Perhaps there is a script that they should run or some environment variables that they need to set. Make these steps explicit. These instructions could also be useful to your future self.
+```bash
+# FIXME: verify the project id in the URL
+curl -O --header "PRIVATE-TOKEN: <security-token>" "https://ict.inaf.it/gitlab/api/v4/projects/79/packages/generic/vlkb-datasets/X.Y/<package-name>"
 
-You can also document commands to lint the code or run tests. These steps help to ensure high code quality and reduce the likelihood that the changes inadvertently break something. Having instructions for running tests is especially helpful if it requires external setup, such as starting a Selenium server for testing in a browser.
-
-## Authors and acknowledgment
-Show your appreciation to those who have contributed to the project.
+```
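+
+Once downloaded, the packages are installed in the usual way. A minimal sketch, assuming a Debian host and a Tomcat installation under /var/lib/tomcat9 (paths and versions are illustrative):
+
+```bash
+# install the cutout engine and the optional command line utility
+sudo dpkg -i vlkbd-X.Y.Z.deb vlkb-X.Y.Z.deb
+
+# deploy the web application by copying the war into the servlet container
+sudo cp vlkb-soda-X.Y.Z.war /var/lib/tomcat9/webapps/vlkb-soda.war
+```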
 
-## License
-For open source projects, say how it is licensed.
 
-## Project status
-If you have run out of energy or time for your project, put a note at the top of the README saying that development has slowed down or stopped completely. Someone may choose to fork your project or volunteer to step in as a maintainer or owner, allowing your project to keep going. You can also make an explicit request for maintainers.
diff --git a/auth/Makefile b/auth/Makefile
new file mode 100644
index 0000000000000000000000000000000000000000..36c0b4b3ada259db9223bb0beab4ac31cc12d664
--- /dev/null
+++ b/auth/Makefile
@@ -0,0 +1,41 @@
+################################################################################
+LIB_DIR  = target/lib
+TARGET = $(LIB_DIR)/vlkb-auth.jar
+CLASS_DIR = target/classes
+VERSION ?= $(shell git describe)
+################################################################################
+EXT_LIB_DIR  = ../java-libs/lib
+################################################################################
+JC = javac
+JFLAGS = -g
+CLASSPATH = $(CLASS_DIR):$(EXT_LIB_DIR)/*
+################################################################################
+SRC_DIR = src/main/java:src/test/java
+SOURCES  = $(wildcard src/main/java/*Filter.java) src/main/java/AuthPolicy.java src/test/java/Main.java
+################################################################################
+
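+# Usage (assumed): plain "make" compiles the sources and packages
+# target/lib/vlkb-auth.jar; "make clean" removes the build artifacts;
+# "make run-test" runs Main against ../test/token.base64.
+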
+all : build
+
+build : jar
+
+.PHONY: all build jar classes makedirs clean test run-test
+
+makedirs:
+	mkdir -p $(CLASS_DIR) $(LIB_DIR)
+
+classes: makedirs
+	$(JC) $(JFLAGS) -cp $(CLASSPATH) -sourcepath $(SRC_DIR) -d $(CLASS_DIR) $(SOURCES)
+
+jar: classes
+	jar -cf $(TARGET) -C $(CLASS_DIR) .
+
+clean :
	rm -fr $(CLASS_DIR) $(LIB_DIR)
	-rmdir target
+
+run-test:
+	java -cp $(CLASSPATH) Main ../test/token.base64
+
+test:
+	@echo "SOURCES  : "$(SOURCES)
+
diff --git a/auth/permissions-table.sql b/auth/permissions-table.sql
new file mode 100644
index 0000000000000000000000000000000000000000..8ac6fd17eee6e20105ba8a3097509e5716aac1a2
--- /dev/null
+++ b/auth/permissions-table.sql
@@ -0,0 +1,6 @@
+
+
+CREATE TABLE permissions ( obs_publisher_did VARCHAR PRIMARY KEY, groups TEXT[] NULL );
+
+INSERT INTO permissions (obs_publisher_did, groups) VALUES ('ivo://auth.example.org/datasets/fits?cubes/part-Eridanus_full_image_V3.fits#0','{AllPrivate}');
+
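+-- Example authorization check (a sketch, assuming PostgreSQL): list the datasets
+-- visible to a user whose group list contains 'AllPrivate', using the
+-- array-overlap operator && to intersect the user's groups with the row's groups.
+SELECT obs_publisher_did FROM permissions WHERE groups && ARRAY['AllPrivate'];
+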
diff --git a/auth/resources/Backup/auth.properties-AuthLibExample b/auth/resources/Backup/auth.properties-AuthLibExample
new file mode 100644
index 0000000000000000000000000000000000000000..d032f0cb0c9f8f50485e411524038de94b0982be
--- /dev/null
+++ b/auth/resources/Backup/auth.properties-AuthLibExample
@@ -0,0 +1,14 @@
+client_id=test
+client_secret=test-secret
+rap_uri=http://localhost/rap-ia2
+store_state_on_login_endpoint=true
+scope=openid read:userspace write:userspace read:fileserver write:fileserver read:gms read:rap
+
+gms_uri=http://localhost:8082/gms
+groups_autoload=false
+
+# default values
+access_token_endpoint=/auth/oauth2/token
+user_authorization_endpoint=/auth/oauth2/authorize
+check_token_endpoint=/auth/oauth2/token
+jwks_endpoint=/auth/oidc/jwks
diff --git a/auth/resources/Backup/shiro.ini b/auth/resources/Backup/shiro.ini
new file mode 100644
index 0000000000000000000000000000000000000000..4324a25031ec4eab8cdd1b1aa288afd17f3300eb
--- /dev/null
+++ b/auth/resources/Backup/shiro.ini
@@ -0,0 +1,40 @@
+#
+# Copyright (c) 2013 Les Hazlewood and contributors
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+# INI configuration is very powerful and flexible, while still remaining succinct.
+# Please see http://shiro.apache.org/configuration.html and
+# http://shiro.apache.org/web.html for more.
+
+[main]
+shiro.loginUrl = /login.jsp
+cacheManager = org.apache.shiro.cache.MemoryConstrainedCacheManager
+securityManager.cacheManager = $cacheManager
+#securityManager.realm = $stormpathRealm
+
+[users]
+# syntax: user = password , roles
+vialactea = ia2vlkb, ROLE_ADMIN
+
+[roles]
+ROLE_ADMIN = *
+
+[urls]
+#/login.jsp = authc
+/logout = logout
+/** = authcBasic
+#/ivoa/resources/basic/** = authcBasic
+#/ivoa/resources/full/** = authc
+
diff --git a/auth/resources/auth.properties b/auth/resources/auth.properties
new file mode 100644
index 0000000000000000000000000000000000000000..c9c8aee27f0017b03a10a17896236eae4a93a018
--- /dev/null
+++ b/auth/resources/auth.properties
@@ -0,0 +1,10 @@
+rap_uri=https://sso.ia2.inaf.it/rap-ia2
+gms_uri=https://sso.ia2.inaf.it/gms
+client_id=vospace_ui_demo
+client_secret=VOSpaceDemo123
+
+groups_autoload=true
+store_state_on_login_endpoint=true
+scope=openid email profile read:rap
+
+allow_anonymous_access=true
diff --git a/auth/resources/iamtoken.properties b/auth/resources/iamtoken.properties
new file mode 100644
index 0000000000000000000000000000000000000000..d275d68bee277ed3450eee1349d4a3a2c48210dc
--- /dev/null
+++ b/auth/resources/iamtoken.properties
@@ -0,0 +1,13 @@
+
+# certificates endpoint
+#jwks_url=
+introspect=
+client_name=
+client_password=
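+# Example values (hypothetical IAM deployment; adjust to your identity provider):
+#introspect=https://iam.example.org/introspect
+#client_name=vlkb-soda-client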
+
+# account created for the service
+resource_id=
+
+# username for non-authenticated requests
+non_authn_username=anonymous
+
diff --git a/auth/resources/neatoken.properties b/auth/resources/neatoken.properties
new file mode 100644
index 0000000000000000000000000000000000000000..839e15d714346acd080d3bc7474dc164e97a4af8
--- /dev/null
+++ b/auth/resources/neatoken.properties
@@ -0,0 +1,10 @@
+
+# certificates endpoint
+jwks_url=
+
+# account created for the service
+resource_id=
+
+# username for non-authenticated requests
+non_authn_username=anonymous
+
diff --git a/auth/src/main/java/AuthPolicy.java b/auth/src/main/java/AuthPolicy.java
new file mode 100644
index 0000000000000000000000000000000000000000..8c3d60f3b88d9899726e44c213a751d6cbb3a203
--- /dev/null
+++ b/auth/src/main/java/AuthPolicy.java
@@ -0,0 +1,338 @@
+
+// external inputs:
+// User data (name and list of groups)
+// List of PublisherDid's as param to some app-function (multi-cutout, merge)
+// Database connection (from Settings)
+
+// For non-authenticated requests two behaviours are possible:
+// A, setups with configured security: return only PUBLIC data
+// B, setups without need of security: access all data
+// Currently B is supported: Vlkb security filters will always set UserPrincipal.
+// Security filters could reserve an 'anonymous' user for non-authenticated requests, if needed.
+// So a missing UserPrincipal is interpreted as a setup without security filters - full access allowed.
+
+
+import java.util.logging.Logger;
+
+import java.io.PrintWriter;
+import java.security.Principal;
+
+import java.util.List;
+import java.util.ArrayList;
+import java.util.LinkedList;
+import java.util.Collections;
+import java.util.Set;
+import java.util.HashSet;
+import java.util.Arrays;
+import java.util.ListIterator;
+
+
+
+public class AuthPolicy
+{
+   private static final Logger LOGGER = Logger.getLogger(AuthPolicy.class.getName());
+
+   enum Access { PUBLIC_ONLY, PUBLIC_AND_AUTHORIZED_PRIVATE };
+   private Access access;
+
+   private String   userName;
+   private String[] userGroups;
+   private boolean userGroupsValid;
+
+   private String dbConnUrl;
+   private String dbUserName;
+   private String dbPassword;
+
+
+   public AuthPolicy(String userName, String[] userGroups)
+   {
+      this.userName   = userName;
+      this.userGroups = userGroups;
+      this.userGroupsValid = true;
+
+      access = Access.PUBLIC_AND_AUTHORIZED_PRIVATE;
+
+      LOGGER.info("User [Groups]: " + userName + " [ " + String.join(" ", userGroups) + " ]" );
+   }
+
+
+
+
+   public AuthPolicy(Principal principal)
+   {
+      if(principal == null)
+      {
+         access = Access.PUBLIC_ONLY;
+         userName = null;
+         userGroups = null;
+         userGroupsValid = false;
+         LOGGER.info("Non authenticated request (UserPrincipal null in HttpServletRequest)");
+      }
+      else
+      {
+         if(principal instanceof VlkbUser)
+         {
+            VlkbUser vlkbUser = (VlkbUser) principal;
+
+            userName   = vlkbUser.getName();
+            userGroups = vlkbUser.getGroupsAsArray();
+            userGroupsValid = true;
+
+            access = Access.PUBLIC_AND_AUTHORIZED_PRIVATE;
+
+            LOGGER.info("User [Groups]: " + userName + " [ " + String.join(" ", userGroups) + " ]" );
+         }
+         else
+         {
+            userName = principal.getName();
+            LOGGER.info("DBG principal not instance of VlkbUser, but has user-name: " + userName);
+            userGroups = new String[]{""};//{"VLKB.groupA", "AllPrivate"}; // was for shiro
+            userGroupsValid = true;
+            access = Access.PUBLIC_AND_AUTHORIZED_PRIVATE;
+            //throw new IllegalArgumentException("UserPrincipal is not of expected type");
+         }
+      }
+   }
+
+
+
+   public String getUserName()
+   {
+      return userName;
+   }
+
+   public boolean getUserGroupsValid()
+   {
+      return userGroupsValid;
+   }
+
+
+   public String[] getUserGroups()
+   {
+      return userGroups;
+   }
+
+   public String getUserGroupsSqlFormat()
+   {
+      if( (userGroups != null) && (userGroups.length > 0) )
+      {
+         return "\"" + String.join("\",\"" , userGroups) + "\"";
+      }
+      else
+      {
+         return null;
+      }
+   }
+
+   public String getUserGroupsAsString(String separator)
+   {
+      if( (userGroups != null) && (userGroups.length > 0) )
+      {
+         return String.join(separator, userGroups);
+      }
+      else
+      {
+         return null;
+      }
+   }
+
+
+
+
+   public String getAccessPolicy()
+   {
+      return access.name(); // returns enum as string
+   }
+
+
+
+   public void toXML(PrintWriter writer)
+   {
+      writer.println("<AccessPolicy>" + this.getAccessPolicy() + "</AccessPolicy>");
+      String ug = getUserGroupsAsString(" ");
+      if(userName   != null) writer.println("<UserName>" + userName + "</UserName>");
+      if(ug         != null) writer.println("<GroupNames>" + ug + "</GroupNames>");
+   }
+
+
+
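+   // Filters the given PublisherDIDs against the access policy: anonymous
+   // requests keep only public datasets, authenticated requests keep public
+   // datasets plus private ones shared with at least one of the user's groups.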
+   public String[] filterAuthorized(String[] pubdidArr, String dbConnUrl, String dbUserName, String dbPassword)
+   {
+      this.dbConnUrl = dbConnUrl;
+      this.dbUserName = dbUserName;
+      this.dbPassword = dbPassword;
+
+      LOGGER.info("with String[] trace");
+      return filterAuthorized(new ArrayList<String>(Arrays.asList(pubdidArr)), dbConnUrl);
+   }
+
+   private String[] filterAuthorized(ArrayList<String> pubdidList, String dbConnUrl)
+   {
+      //LOGGER.info("with List <String> trace");
+      switch(access)
+      {
+         case PUBLIC_ONLY :
+            filterNotPublic(pubdidList, dbConnUrl);
+            break;
+
+         case PUBLIC_AND_AUTHORIZED_PRIVATE :
+            filterNotAuthorized(pubdidList, dbConnUrl);
+            break;
+
+         default :
+            assert false : "Unrecoginzed  access : " + access;
+      }
+      return pubdidList.toArray(new String[0]); 
+   }
+
+
+   private void filterNotPublic(ArrayList<String> pubdids, String dbConnUrl)
+   {
+      LOGGER.info("trace");
+      assert pubdids != null;
+      //LOGGER.info("PublisherDID list original : " + String.join(" ", pubdids));
+
+      List<AuthPolicyDb.PubdidGroups> privateUniqPubdids = db_queryPrivateUniqPubdidGroups(dbConnUrl, pubdids);
+      List<String> notAuthorizedUniqPubdids = pubdidsNotPublic(privateUniqPubdids, userGroups);
+
+      LOGGER.info("AuthZ removes: " + String.join(" ", notAuthorizedUniqPubdids));
+
+      removeNotAuthorized(pubdids, notAuthorizedUniqPubdids);
+
+      //LOGGER.info("PublisherDID list filtered : " + (pubdids.isEmpty() ? "" : String.join(" ", pubdids)));
+   }
+
+
+   private List<String> pubdidsNotPublic(List<AuthPolicyDb.PubdidGroups> pubdidList, String[] userGroups)
+   {
+      LOGGER.info("trace");
+      //LOGGER.info("userGroups: " + String.join(" ",userGroups));
+
+      List<String> pubdidsNotAuthorizedList = new LinkedList<String>();
+      ListIterator<AuthPolicyDb.PubdidGroups> it = pubdidList.listIterator();
+
+      while (it.hasNext())
+      {
+         AuthPolicyDb.PubdidGroups pubdidGroups = it.next();
+
+         //LOGGER.info(pubdidGroups.pubdid + " : " + String.join(" ",pubdidGroups.groups));
+
+         if( true ) // PUBLIC_ONLY: every private dataset is unauthorized, regardless of groups
+         {
+            pubdidsNotAuthorizedList.add(pubdidGroups.pubdid);
+         }
+      }
+
+      return pubdidsNotAuthorizedList;
+   }
+
+
+
+   private void filterNotAuthorized(ArrayList<String> pubdids, String dbConnUrl)
+   {
+      LOGGER.info("trace");
+      assert pubdids != null;
+      //LOGGER.info("PublisherDID list original : " + String.join(" ", pubdids));
+
+      List<AuthPolicyDb.PubdidGroups> privateUniqPubdids = db_queryPrivateUniqPubdidGroups(dbConnUrl, pubdids);
+      List<String> notAuthorizedUniqPubdids = pubdidsNotAuthorized(privateUniqPubdids, userGroups);
+
+      LOGGER.info("AuthZ removes: " + String.join(" ", notAuthorizedUniqPubdids));
+
+      removeNotAuthorized(pubdids, notAuthorizedUniqPubdids);
+
+      //LOGGER.info("PublisherDID list filtered : " + (pubdids.isEmpty() ? "" : String.join(" ", pubdids)));
+   }
+
+
+
+   private void removeNotAuthorized(ArrayList<String> pubdids, List<String> notAuthorizedUniqPubdids)
+   {
+      ListIterator<String> itr = pubdids.listIterator();
+      while (itr.hasNext())
+      {
+         String pubdid = itr.next();
+
+         for(String notAuthPubdid : notAuthorizedUniqPubdids)
+         {
+            if (pubdid.equals(notAuthPubdid)) itr.remove();
+         }
+      }
+
+      return;
+   }
+
+
+
+   private List<AuthPolicyDb.PubdidGroups> db_queryPrivateUniqPubdidGroups(String dbConnUrl, List<String> pubdids)
+   {
+      AuthPolicyDb adb;
+      synchronized(AuthPolicyDb.class)
+      {
+         AuthPolicyDb.dbConnUrl  = this.dbConnUrl;
+         AuthPolicyDb.dbUserName = this.dbUserName;
+         AuthPolicyDb.dbPassword = this.dbPassword;
+
+         adb = new AuthPolicyDb();
+      }
+
+      Set<String> uniqPubdids = new HashSet<String>(pubdids);
+
+      if(uniqPubdids.isEmpty())
+      {
+         List<AuthPolicyDb.PubdidGroups> privatePubdidGroups = Collections.emptyList();
+         return privatePubdidGroups;
+      }
+      else
+      {
+         // FIXME handle DB-exceptions
+         List<AuthPolicyDb.PubdidGroups> privatePubdidGroups = adb.queryGroupsPrivateOnly(uniqPubdids);
+         return privatePubdidGroups;
+      }
+   }
+
+
+
+   private List<String> pubdidsNotAuthorized(List<AuthPolicyDb.PubdidGroups> pubdidList, String[] userGroups)
+   {
+      LOGGER.info("trace");
+      //LOGGER.info("userGroups: " + String.join(" ",userGroups));
+
+      List<String> pubdidsNotAuthorizedList = new LinkedList<String>();
+      ListIterator<AuthPolicyDb.PubdidGroups> it = pubdidList.listIterator();
+
+      while (it.hasNext())
+      {
+         AuthPolicyDb.PubdidGroups pubdidGroups = it.next();
+
+         //LOGGER.info(pubdidGroups.pubdid + " : " + String.join(" ",pubdidGroups.groups));
+
+         if( isIntersectionEmpty(pubdidGroups.groups, userGroups) )
+         {
+            pubdidsNotAuthorizedList.add(pubdidGroups.pubdid);
+         }
+      }
+
+      return pubdidsNotAuthorizedList;
+   }
+
+
+
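+   // Returns true when the two arrays share no common element
+   // (a simple O(n*m) scan; group lists are expected to be small).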
+   private boolean isIntersectionEmpty(String[] stringsA, String[] stringsB)
+   {
+      for(String strA : stringsA)
+         for(String strB : stringsB)
+         {
+            if(strA.equals(strB))
+            {
+               return false;
+            }
+         }
+      return true;
+   }
+
+
+
+}
+
diff --git a/auth/src/main/java/AuthPolicyDb.java b/auth/src/main/java/AuthPolicyDb.java
new file mode 100644
index 0000000000000000000000000000000000000000..9f737ece7415810ae6d92fecca867f70d7385675
--- /dev/null
+++ b/auth/src/main/java/AuthPolicyDb.java
@@ -0,0 +1,256 @@
+
+import java.util.logging.Logger;
+
+// mySQL access
+import java.sql.Connection;
+import java.sql.DriverManager;
+import java.sql.Driver;
+import java.sql.ResultSet;
+import java.sql.Statement;
+import java.sql.SQLException;
+import java.sql.Array;
+// import javax.sql.*; needed if using DataSource instead of DriverManager for DB-connections
+
+import java.net.MalformedURLException;
+import java.net.URI;
+import java.net.URL;
+import java.net.URLClassLoader;
+
+import java.util.Enumeration;
+import java.util.List;
+import java.util.LinkedList;
+import java.util.Set;
+import java.util.HashSet;
+import java.util.ArrayList;
+
+import java.lang.ClassNotFoundException;
+
+
+
+public class AuthPolicyDb
+{
+   private static final Logger LOGGER = Logger.getLogger(AuthPolicyDb.class.getName());
+
+   private static final String DB_DRIVER = "org.postgresql.Driver";
+   //private static final Settings settings = Settings.getInstance();
+   //static public Settings.DBConn dbconn = settings.dbConn;
+   static public String dbConnUrl;
+   static public String dbUserName;
+   static public String dbPassword;
+
+   private Connection conn;
+   private Statement  st;
+   private ResultSet  res;
+
+   AuthPolicyDb(){
+      conn = null;
+      st   = null;
+      res  = null;
+   }
+
+
+
+   public class PubdidGroups
+   {
+      String pubdid;
+      String[] groups;
+      PubdidGroups(String pubdid, String[] groups)
+      {
+         this.pubdid = pubdid;
+         this.groups = groups;
+      }
+   }
+
+
+/*
+   private String convertToVlkbPubdid(String obscorePubdid)
+   {
+      final String PUBDID_PREFIX = dbconn.obscorePublisher;
+
+      if(obscorePubdid.startsWith(PUBDID_PREFIX))
+         return obscorePubdid.substring( PUBDID_PREFIX.length() );
+      else
+         return obscorePubdid;
+   }
+
+   private Set<String> convertToObscorePubdids(Set<String> vlkbPubdids)
+   {
+      final String PUBDID_PREFIX = dbconn.obscorePublisher;
+
+      Set<String> obscorePubdids = new HashSet<String>();
+
+      for(String pubdid : vlkbPubdids)
+      {
+         String obscorePubdid =  "\'" + PUBDID_PREFIX + pubdid + "\'";
+         obscorePubdids.add(obscorePubdid);
+      }
+
+      return obscorePubdids;
+   }
+*/
+
+   public List<PubdidGroups> queryGroupsPrivateOnly(Set<String> uniqPubdids)
+   {
+      //Set<String> uniqObscorePubdids = convertToObscorePubdids(uniqPubdids);
+      Set<String> uniqObscorePubdids = uniqPubdids;
+      String commaSepObscorePubdids  = String.join("\',\'", uniqObscorePubdids);
+
+      assert (commaSepObscorePubdids != null) && (!commaSepObscorePubdids.isEmpty());
+
+      String TheQuery = "SELECT obs_publisher_did,groups FROM obscore "
+         + "WHERE (policy = 'PRIV') AND (obs_publisher_did IN (\'"+commaSepObscorePubdids+"\'));";
+
+      // FIXME use separate table holding  _only_  private data-id's
+      //String TheQuery = "SELECT obs_publisher_did,groups FROM permissions "
+      //   + "WHERE (obs_publisher_did IN (\'"+commaSepObscorePubdids+"\'));";
+
+      //LOGGER.info(TheQuery);
+
+      List<PubdidGroups> pubdidGroups = new LinkedList<PubdidGroups>();
+      try
+      {
+         res = doQuery(TheQuery);
+
+         while (res.next())
+         {
+            //String pubdid   = convertToVlkbPubdid(res.getString("obs_publisher_did"));
+            String pubdid   = res.getString("obs_publisher_did");
+            Array groupsArr = res.getArray("groups");
+
+            String[] groups   = null;
+            if(groupsArr == null)
+               groups = null;
+            else
+               groups = (String[]) groupsArr.getArray();
+
+            PubdidGroups pg = new PubdidGroups(pubdid, groups);
+            pubdidGroups.add(pg); 
+         }
+      }
+      catch (SQLException se)
+      {
+         logSqlExInfo(se);
+         se.printStackTrace();
+      }
+      catch (ClassNotFoundException e)
+      {
+         LOGGER.info("DB driver "+ DB_DRIVER +" not found: " + e.getMessage());
+         e.printStackTrace();
+      }
+      finally
+      {
+         closeAll();
+      }
+
+      return pubdidGroups; 
+   }
+
+
+   private void closeAll()
+   {
+         if(res  != null ) try { res.close(); } catch(Exception e) {LOGGER.info("DB ResultSet::close() failed");}
+         if(st   != null ) try { st.close();  } catch(Exception e) {LOGGER.info("DB Statement::close() failed");}
+         if(conn != null ) try { conn.close();} catch(Exception e) {LOGGER.info("DB Connection::close() failed");} 
+  }
+
+   private void logSqlExInfo(SQLException se){
+
+      /* dbconn.print_class_vars(); */
+
+      System.err.println("SQLState : " + se.getSQLState());
+      System.err.println("ErrorCode: " + se.getErrorCode());
+      System.err.println("Message  : " + se.getMessage());
+      Throwable t = se.getCause();
+      while(t != null) {
+         System.err.println("Cause: " + t);
+         t = t.getCause();
+      }
+   }
+
+
+
+   private ResultSet doQuery(String TheQuery)
+      throws SQLException, ClassNotFoundException 
+   {
+
+      /* https://docs.oracle.com/javase/tutorial/jdbc/basics/connecting.html :
+         Any JDBC 4.0 drivers that are found in your class path are automatically loaded.
+         (However, you must manually load any drivers prior to JDBC 4.0 with the method
+         Class.forName.)
+         */
+      // try {
+//      Class.forName(DB_DRIVER);
+      /* OR
+         DriverManager.registerDriver(new org.postgresql.Driver());
+         */
+
+      /*LOGGER.info(getClasspathString());*/
+      LOGGER.info(getRegisteredDriverList());
+
+      // FIXME seems DriverManager expects jdbc:postgresql driver scheme, it does not support postgresql:// scheme
+      // additionally:
+      // jdbc:postgresql:// scheme does not support username:password in the URL. 
+      // So:
+      // receive postgresql:// scheme with user:password and convert to jdbc:postgresql://
+      // by extracting userName and password from the URL-string and prepending 'jdbc:'
+      // 
+
+      /*
+      LOGGER.info("DBMS URL: " + dbConnUrl);
+      URI dbConnUri = new URI(dbConnUrl);
+
+      String userInfoString = dbConnUri.getUserInfo();
+
+      if(userInfoString == null) throw new AssertionError("DBMS URL must contain user:password but it is: " + dbConnUrl);
+
+      String[] userInfo = userInfoString.split(":");
+
+      if(userInfo.length < 2) throw new AssertionError("DBMS URL must contain user:password but it is: " + dbConnUrl);
+
+      String userName = userInfo[0];
+      String password = userInfo[1];
+
+      String dbConnJdbcUrl = "jdbc:" + dbConnUrl.replace(userInfoString + "@", "");
+      */
+      LOGGER.info("DBMS URL: " + dbConnUrl);
+      LOGGER.info("DBMS userName: " + dbUserName);
+      LOGGER.info("DBMS password: " + dbPassword);
+
+      conn = DriverManager.getConnection(dbConnUrl, dbUserName, dbPassword);
+
+      st = conn.createStatement();
+
+      // } catch (Exception e){ e.printStackTrace();}
+
+      return st.executeQuery(TheQuery);
+   }
+
+
+   private String getClasspathString() {
+      StringBuffer classpath = new StringBuffer("getClasspathString:\r\n");
+      ClassLoader applicationClassLoader = this.getClass().getClassLoader();
+      if (applicationClassLoader == null) {
+         applicationClassLoader = ClassLoader.getSystemClassLoader();
+      }
+      URL[] urls = ((URLClassLoader)applicationClassLoader).getURLs();
+      for(int i=0; i < urls.length; i++) {
+         classpath.append(urls[i].getFile()).append("\r\n");
+      }
+
+      return classpath.toString();
+   }
+
+
+   private String getRegisteredDriverList()
+   {
+      StringBuffer drvList = new StringBuffer("getRegisteredDriverList:\r\n");
+      for (Enumeration e = DriverManager.getDrivers();
+            e.hasMoreElements(); )
+      {
+         Driver d = (Driver) e.nextElement();
+         String driverClass = d.getClass().getName();
+         drvList.append(driverClass).append("\r\n");	
+      }
+      return drvList.toString();
+   }
+
+
+}
diff --git a/auth/src/main/java/IA2TokenConvFilter.java b/auth/src/main/java/IA2TokenConvFilter.java
new file mode 100644
index 0000000000000000000000000000000000000000..a64c181ed275cda3bfd0af2e7b84a2b37dc0f709
--- /dev/null
+++ b/auth/src/main/java/IA2TokenConvFilter.java
@@ -0,0 +1,130 @@
+
+import it.inaf.ia2.aa.data.User;
+
+import java.io.IOException;
+import java.util.*; // ArrayList<String>
+
+import java.util.logging.Logger;
+import javax.servlet.Filter;
+import javax.servlet.FilterChain;
+import javax.servlet.FilterConfig;
+import javax.servlet.ServletException;
+import javax.servlet.ServletRequest;
+import javax.servlet.ServletResponse;
+import javax.servlet.http.HttpServletRequest;
+import javax.servlet.http.HttpServletResponse;
+
+
+import javax.servlet.http.HttpServletRequestWrapper;
+import java.security.Principal;
+
+
+public class IA2TokenConvFilter implements Filter
+{
+  private static final Logger LOGGER = Logger.getLogger("IA2TokenConvFilter");
+
+   @Override
+   public void init(FilterConfig fc) throws ServletException
+   {
+      LOGGER.info("trace");
+   }
+
+   @Override
+   public void destroy()
+   {
+      LOGGER.info("trace");
+   }
+
+   @Override
+   public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
+                   throws IOException, ServletException
+   {
+      LOGGER.info("trace");
+
+        HttpServletRequest  request  = (HttpServletRequest)  req;
+        HttpServletResponse response = (HttpServletResponse) res;
+
+        String authHeader = request.getHeader("Authorization");
+        if (authHeader != null)
+        {
+            LOGGER.info("Authorization header: " + authHeader.substring(0, 7+60) + " ...");
+            if (authHeader.startsWith("Bearer "))
+            {
+
+
+               Principal principal = request.getUserPrincipal();
+               if(principal == null)
+               {
+                   LOGGER.warning("User principal is null");
+                   response.sendError(500, "Internal error - User principal is not correct");
+                   return;
+               }
+
+               VlkbUser user = new VlkbUser();
+
+               if(principal instanceof it.inaf.ia2.aa.data.User)
+               {
+                  it.inaf.ia2.aa.data.User alUser = (it.inaf.ia2.aa.data.User) principal;
+
+                  String userId       = alUser.getName();//UserId
+                  String userLabel    = alUser.getUserLabel();
+                  List<String> groups = alUser.getGroups();
+
+                  // FIXME check is any NULL ?
+
+                  user.setUserId(userId);
+                  user.setUserLabel(userLabel);
+                  user.setGroups(groups);
+               }
+               else
+               {
+                   LOGGER.warning("User principal is incorrect type");
+                   response.sendError(500, "Internal error - User principal is not correct type");
+                   return;
+               }
+
+               HttpServletRequestWrapper requestWithPrincipal
+                             = new RequestWithPrincipal(request, user);
+
+               chain.doFilter(requestWithPrincipal, response);
+               return;
+            }
+            else
+            {
+                LOGGER.warning("Detected Authorization header without Bearer token.");
+            }
+        }
+        else
+        {
+            LOGGER.warning("Request has no Authorization header.");
+            if(request.getUserPrincipal() != null)
+            {
+                   LOGGER.warning("User principal is set however no Authorization header present");
+            //       response.sendError(500, "Internal error - It is not expected that Principal set in request but there is no Auhtprozation in HTTP-Header"); // FIXME use other err code not 500 here
+             //      return;
+            }
+        }
+        chain.doFilter(request, response);
+//         response.sendError(401, "Unauthorized");
+   }
+
+
+
+   private static class RequestWithPrincipal extends HttpServletRequestWrapper
+   {
+      private final VlkbUser user;
+
+      public RequestWithPrincipal(HttpServletRequest request, VlkbUser user)
+      {
+         super(request);
+         this.user = user;
+      }
+
+      @Override
+      public Principal getUserPrincipal() {
+         return user;
+      }
+   }
+
+}
+
diff --git a/auth/src/main/java/IamSigningKeyResolver.java b/auth/src/main/java/IamSigningKeyResolver.java
new file mode 100644
index 0000000000000000000000000000000000000000..ff89893067b2748ea91f6cda0d638eb0b3ea9126
--- /dev/null
+++ b/auth/src/main/java/IamSigningKeyResolver.java
@@ -0,0 +1,153 @@
+
+// 1. HTTPS
+import java.net.URL;
+import java.io.*;
+import javax.net.ssl.HttpsURLConnection;
+
+// 2. json deser
+//import org.codehaus.jackson.map.ObjectMapper;
+import com.fasterxml.jackson.annotation.JsonProperty;
+import com.fasterxml.jackson.core.JsonProcessingException;
+import com.fasterxml.jackson.databind.JsonNode;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.node.ArrayNode;
+import com.fasterxml.jackson.databind.node.JsonNodeFactory;
+import com.fasterxml.jackson.databind.node.ObjectNode;
+import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
+import com.fasterxml.jackson.annotation.JsonAutoDetect;
+
+
+// 3, extract PublicKey
+import java.util.Base64;
+import java.io.ByteArrayInputStream;
+import java.security.GeneralSecurityException; 
+import java.security.PublicKey; 
+import java.security.Signature; 
+import java.security.cert.CertificateFactory; 
+import java.security.cert.X509Certificate; 
+
+// 4, validate token
+import java.security.spec.InvalidKeySpecException;
+import java.security.NoSuchAlgorithmException;
+import java.security.Key;
+import java.security.PublicKey;
+import java.security.interfaces.RSAPublicKey;
+import io.jsonwebtoken.Header;
+import io.jsonwebtoken.Claims;
+import io.jsonwebtoken.Jwt;
+import io.jsonwebtoken.Jws;
+import io.jsonwebtoken.JwsHeader;
+import io.jsonwebtoken.Jwts;
+import io.jsonwebtoken.jackson.io.JacksonDeserializer;
+import io.jsonwebtoken.SigningKeyResolverAdapter;
+import io.jsonwebtoken.security.Jwk;
+import io.jsonwebtoken.security.Jwks;
+// only dbg: when keys taken from file, not URL
+import java.nio.file.Files;
+import java.nio.file.Paths;
+
+import java.util.logging.Logger;
+
+public class IamSigningKeyResolver extends SigningKeyResolverAdapter
+{
+   private static final Logger LOGGER = Logger.getLogger(IamSigningKeyResolver.class.getName());
+   private String keysURL;
+
+
+   public IamSigningKeyResolver(String keysUrl) {this.keysURL = keysUrl;}
+
+   @Override
+   public Key resolveSigningKey(JwsHeader jwsHeader, Claims claims)
+   {
+      LOGGER.info( "IamSigningKeyResolver::resolveSigningKey" );
+
+      //inspect the header or claims, lookup and return the signing key
+
+      String keyId = jwsHeader.getKeyId(); //or any other field that you need to inspect
+
+      Key key = null;
+      try
+      {
+         key = lookupVerificationKey(keyId);
+      }
+      catch(Exception e)
+      {
+         e.printStackTrace();
+      }
+
+      return key;
+   }
+
+
+
+   private Key lookupVerificationKey(String keyId)
+         throws Exception, GeneralSecurityException
+      {
+         LOGGER.info( "IamSigningKeyResolver::lookupVerificationKey" );
+
+         String jsonKeys = doHttps();
+
+         PublicKey pubKey = (PublicKey)getKeyFromKeys(jsonKeys, keyId);
+
+         return pubKey;
+      }
+
+
+   private String doHttps() throws Exception
+   {
+      LOGGER.info("doHttps : " + keysURL);
+
+      URL myUrl = new URL(keysURL);
+      HttpsURLConnection conn = (HttpsURLConnection)myUrl.openConnection();
+      InputStream is = conn.getInputStream();
+      InputStreamReader isr = new InputStreamReader(is);
+      BufferedReader br = new BufferedReader(isr);
+
+      String inputLine;
+      String jsonKeys = ""; 
+      while ((inputLine = br.readLine()) != null) {
+         jsonKeys = jsonKeys + inputLine;
+      }
+
+      br.close();
+
+      return jsonKeys;
+   }
+
+
+   private Key getKeyFromKeys(String jsonKeys, String keyId)
+         throws JsonProcessingException, GeneralSecurityException, IOException
+      {
+         LOGGER.info( "IamSigningKeyResolver::getKeyFromKeys");
+
+         Key key = null;
+
+         ObjectMapper mapper = new ObjectMapper();
+
+         JsonNode keysNode = mapper.readTree(jsonKeys).get("keys");
+         if(keysNode.isArray())
+         {
+            for (JsonNode node : keysNode)
+            {
+               String nodeContent = mapper.writeValueAsString(node);
+
+               LOGGER.info("key: " + nodeContent);
+
+               Jwk<?> jwk = Jwks.parser().build().parse(nodeContent);
+
+               String jwkkid = jwk.getId();
+
+               LOGGER.info("kid-token : " + keyId + "kid-store : " + jwkkid + " key-type: " + jwk.getType());
+
+               if(keyId.equals(jwkkid))
+               {
+                  key = jwk.toKey();
+               }
+            }
+         }
+
+         return key;
+      }
+
+}
+
diff --git a/auth/src/main/java/IamTokenFilter.java b/auth/src/main/java/IamTokenFilter.java
new file mode 100644
index 0000000000000000000000000000000000000000..f434f824fb0f3e03e69398bd23795d69f148d5eb
--- /dev/null
+++ b/auth/src/main/java/IamTokenFilter.java
@@ -0,0 +1,431 @@
+
+import io.jsonwebtoken.JwtException;
+import io.jsonwebtoken.InvalidClaimException;
+import io.jsonwebtoken.Header;
+import io.jsonwebtoken.Claims;
+import io.jsonwebtoken.Jwt;
+import io.jsonwebtoken.Jws;
+import io.jsonwebtoken.Jwts;
+import io.jsonwebtoken.jackson.io.JacksonDeserializer;
+
+import java.security.spec.InvalidKeySpecException;
+import java.security.NoSuchAlgorithmException;
+
+import javax.servlet.http.HttpServletRequestWrapper;
+import java.security.Principal;
+
+import java.io.OutputStreamWriter;
+import java.io.PrintWriter;
+import java.io.IOException;
+import java.util.List; // ArrayList<String>
+import java.util.Map;
+import java.util.HashMap;
+import java.util.*;
+
+import java.util.logging.Logger;
+import javax.servlet.Filter;
+import javax.servlet.FilterChain;
+import javax.servlet.FilterConfig;
+import javax.servlet.ServletException;
+import javax.servlet.ServletRequest;
+import javax.servlet.ServletResponse;
+import javax.servlet.http.HttpServletRequest;
+import javax.servlet.http.HttpServletResponse;
+import javax.servlet.ServletOutputStream;
+
+public class IamTokenFilter implements Filter
+{
+   private static final Logger LOGGER = Logger.getLogger("IamTokenFilter");
+   private static final IamTokenSettings settings = IamTokenSettings.getInstance();
+
+   final String RESPONSE_ENCODING = "utf-8";
+
+   final String keysUrl = settings.security.jwksEndpoint;
+   final String INTROSPECT_URL = settings.getIntrospectUrl();
+   final String CLIENT_PASS = settings.getClientName() + ":" + settings.getClientPassword();
+
+
+   @Override
+   public void init(FilterConfig fc) throws ServletException {}
+
+   @Override
+   public void destroy() {}
+
+
+   @Override
+   public void doFilter(ServletRequest req, ServletResponse resp, FilterChain chain)
+      throws IOException, ServletException
+   {
+      String authHeader = ((HttpServletRequest)req).getHeader("Authorization");
+
+      ServletOutputStream  respOutputStream = resp.getOutputStream();
+      PrintWriter writer = new PrintWriter(new OutputStreamWriter(respOutputStream, RESPONSE_ENCODING));
+
+      if(authHeader==null)
+      {
+         final String AUTH_ERR = "Request without Authorization header. Only authenticated requests allowed.";
+         LOGGER.info(AUTH_ERR);
+         sendAuthenticationError((HttpServletResponse)resp, writer, AUTH_ERR);
+      }
+      else
+      {
+         authHeader = authHeader.trim();
+
+         if (authHeader.startsWith("Bearer ") && (authHeader.length() > "Bearer ".length()))
+         {
+            LOGGER.info("Request with Authorization header and has Bearer entry");
+            String token = authHeader.substring("Bearer ".length()).trim();
+
+            doFilterBearer(req, token, resp, chain);
+         }
+         else
+         {
+            final String AUTH_ERR = "Authorization header with Bearer-token expected, but it starts with : "
+               + authHeader.substring(0, "Bearer ".length()) + "...";
+            LOGGER.info(AUTH_ERR);
+            sendUsageError((HttpServletResponse)resp, writer, AUTH_ERR);
+         }
+      }
+   }
+
+
+
+
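+   // Validates the Bearer token via introspection (RFC 7662) at the configured
+   // endpoint; access is granted only when the token is active and its
+   // storage.read scope covers the path of the requested dataset ID.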
+   private void doFilterBearer(ServletRequest req, String token, ServletResponse resp, FilterChain chain)
+         throws IOException, ServletException
+      {
+         HttpServletRequest  request  = (HttpServletRequest) req;
+         HttpServletResponse response = (HttpServletResponse)resp;
+
+         ServletOutputStream  respOutputStream = response.getOutputStream();
+         PrintWriter writer = new PrintWriter(new OutputStreamWriter(respOutputStream, RESPONSE_ENCODING));
+
+         try
+         {
+            IntrospectResponse insResp = new IntrospectResponse(CLIENT_PASS, INTROSPECT_URL, token);
+
+            if(insResp.isTokenActive())
+            {
+               String idString = request.getParameter("ID");
+               Ivoid ivoid = new Ivoid(idString);
+
+               String ivoidPath = ivoid.getLocalPart();
+               String tokenPath = insResp.getPathFromStorageReadScope();
+
+               LOGGER.info("Path from IVOID: " + ivoidPath);
+               LOGGER.info("Path from token: " + tokenPath);
+
+               if(tokenPath.endsWith(ivoidPath))
+               {
+                  LOGGER.info("Access authorized.");
+                  chain.doFilter(request, response);
+               }
+               else
+               {
+                  final String AUTH_ERR = "Bearer token does not authorize access to : " + ivoidPath;
+                  LOGGER.info(AUTH_ERR);
+                  sendAuthorizationError(response, writer, AUTH_ERR);
+               }
+            }
+            else
+            {
+               final String AUTH_ERR = "Bearer-token is inactive.";
+               LOGGER.info(AUTH_ERR);
+               sendAuthorizationError(response, writer, AUTH_ERR);
+            }
+
+         }
+         catch(IndexOutOfBoundsException ex)
+         {
+            LOGGER.info("IndexOutOfBoundsException: " + ex.getMessage());
+            sendUsageError(response, writer, ex.getMessage());
+         }
+         catch(IllegalArgumentException ex)
+         {
+            LOGGER.info("IllegalArgumentException: " + ex.getMessage());
+            sendUsageError(response, writer, ex.getMessage());
+         }
+         catch(Exception ex)
+         {
+            LOGGER.info("Exception: " + ex.getMessage());
+            ex.printStackTrace();
+            sendError(response, writer, ex.toString());
+         }
+         finally
+         {
+            writer.close();
+            respOutputStream.close();
+         }
+      }
+
+
+
+   // 5. SODA sync Responses [Table 6]
+   // Success: 200 (OK) or 204 (No Content) and set HTTP-Headers: Content-Type & Content-Encoding (if applicable)
+   //Error Code          Description
+   //===================================================================================
+   //Error               General error (not covered below)
+   //AuthenticationError Not authenticated
+   //AuthorizationError  Not authorized to access the resource
+   //ServiceUnavailable  Transient error (could succeed with retry)
+   //UsageError          Permanent error (retry pointless)
+   //MultiValuedParamNotSupported  request included multiple values for a parameter
+   //                              but the service only supports a single value 
+
+
+   protected void sendError(HttpServletResponse response, PrintWriter printWriter, String message)
+   {
+      response.setStatus(HttpServletResponse.SC_INTERNAL_SERVER_ERROR);
+      response.setContentType("text/plain");
+      printWriter.println("Error : " + message);
+   }
+
+
+   protected void sendAuthenticationError(HttpServletResponse response, PrintWriter printWriter, String message)
+   {
+      response.setStatus(HttpServletResponse.SC_UNAUTHORIZED);
+      response.setContentType("text/plain");
+      printWriter.println("AuthenticationError : " + message);
+   }
+
+
+   protected void sendAuthorizationError(HttpServletResponse response, PrintWriter printWriter, String message)
+   {
+      response.setStatus(HttpServletResponse.SC_FORBIDDEN);
+      response.setContentType("text/plain");
+      printWriter.println("AuthorizationError : " + message);
+   }
+
+
+   protected void sendServiceUnavailable(HttpServletResponse response, PrintWriter printWriter, String message)
+   {
+      response.setStatus(HttpServletResponse.SC_SERVICE_UNAVAILABLE);
+      response.setContentType("text/plain");
+      printWriter.println("ServiceUnavailable : " + message);
+   }
+
+
+   protected void sendUsageError(HttpServletResponse response, PrintWriter printWriter, String message)
+   {
+      response.setStatus(HttpServletResponse.SC_BAD_REQUEST);
+      response.setContentType("text/plain");
+      printWriter.println("UsageError : " + message);
+   }
+
+
+   protected void sendMultiValuedParamNotSupported(HttpServletResponse response, PrintWriter printWriter, String message)
+   {
+      response.setStatus(HttpServletResponse.SC_BAD_REQUEST);
+      response.setContentType("text/plain");
+      printWriter.println("MultiValuedParamNotSupported : " + message);
+   }
+
+
+
+
+
+
+   // Implementation with JWKs endpoint (explicit signiture verification):
+
+   //final String resourceId = settings.security.resourceId; //"vlkb"
+   //final String realmName = "neanias-production";
+   //final String keysUrl = "https://sso.neanias.eu/auth/realms/" + realmName + "/protocol/openid-connect/certs";
+
+   /*/@Override
+     public void OLD_doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
+        throws IOException, ServletException
+     {
+     HttpServletRequest request = (HttpServletRequest) req;
+     HttpServletResponse response = (HttpServletResponse) res;
+
+     String  qString = request.getQueryString();
+     if(qString == null)
+     LOGGER.info(request.getRequestURL().toString());
+     else
+     LOGGER.info(request.getRequestURL() + "    " + qString);
+
+     String authHeader = request.getHeader("Authorization");
+     if (authHeader == null)
+     {
+     boolean non_authenticated_request = (settings.security.non_authn_username != null);
+
+     if(non_authenticated_request)
+     {
+     chain.doFilter(request, response);
+     }
+     else
+     {
+     LOGGER.info("Request without Authorization header, no Principal added");
+     response.sendError(HttpServletResponse.SC_BAD_REQUEST,
+     "No Authorization in HTTP-header. Only authorized requests allowed.");
+     }
+     return;
+     }
+     else
+     {
+
+     if (authHeader.startsWith("Bearer ") && (authHeader.length() > "Bearer ".length()))
+     {
+     LOGGER.info("Request with Authorization header and has Bearer entry");
+
+     String jws = authHeader.substring("Bearer ".length());
+
+     try
+     {
+     VlkbUser user = getUserFromAccessToken(jws);
+
+     HttpServletRequestWrapper requestWithPrincipal
+     = new RequestWithPrincipal(request, user);
+
+     chain.doFilter(requestWithPrincipal, response);
+     return;
+
+     }
+     catch (JwtException | InvalidTokenException ex)
+     {
+     LOGGER.warning("Token invalid: " + ex.toString());
+     response.sendError(HttpServletResponse.SC_BAD_REQUEST, "Token invalid");
+     return;
+     }
+     catch (Exception ex)
+     {
+     LOGGER.severe(ex.toString());
+     response.sendError(HttpServletResponse.SC_INTERNAL_SERVER_ERROR,
+     "Error during authorization");
+     return;
+     }
+
+     }
+     else
+     {
+     LOGGER.warning("Request with Authorization header but without Bearer token");
+
+     response.sendError(HttpServletResponse.SC_BAD_REQUEST,
+     "Only Bearer authorization supported or token missing");
+   return;
+     }
+
+     }
+     }
+*/
+
+
+   /*/ validate and parse the token
+
+     private List<String> parseScopes(Claims claims)
+     {
+     String scopeStr = (String)claims.get("scope");
+     List<String> scopes = new ArrayList<String>(Arrays.asList(scopeStr.split(" ")));
+
+     if (scopes.stream().anyMatch(s -> s.startsWith("storage.read:")))
+     {
+     return scopes;
+     }
+     else
+     {
+     final String AUTH_ERR = "Invalid token: missing storage.read scope: " + scopeStr;
+     LOGGER.warning(AUTH_ERR);
+     throw new InvalidTokenException(AUTH_ERR);
+     }
+     }
+
+
+
+     VlkbUser getUserFromAccessToken(String jwsString)
+   //throws JwtException, InvalidTokenException  <-- FIXME
+   {
+   long clockSkew = 3 * 60; //3 minutes FIXME get from Config file
+
+   Jws<Claims> jws = Jwts
+   .parser()
+   //.setAllowedClockSkewSeconds(clockSkew) // FIXME needed ?
+   .setSigningKeyResolver(new IamSigningKeyResolver(keysUrl))
+   .build()
+   .parseClaimsJws(jwsString);
+
+   Claims claims = jws.getBody();
+
+   LOGGER.info("scope: " + (String)claims.get("scope"));
+
+   List<String> scopes = parseScopes(claims);
+
+   String storageReadScope = "";
+
+   for(int i=0;i<scopes.size();i++)
+   {
+   if(scopes.get(i).startsWith("storage.read:"))
+   {
+   storageReadScope = scopes.get(i);
+   }
+   }
+
+   LOGGER.info("storage.read: " + storageReadScope);
+
+   String path = storageReadScope.substring(storageReadScope.lastIndexOf(":") + 1);
+
+   LOGGER.info("path: " + path);
+
+   // set User/Principal
+
+   VlkbUser user = new VlkbUser();
+   user.setAccessToken(jwsString);
+   user.setExpirationTime(0);//FIXME
+   user.setUserId((String) claims.get("sub"));
+   user.setUserLabel((String) claims.get("name"));
+   user.setGroups(scopes); // FIXME temp store scopes where roles were in neanias
+
+   return user;
+   }
+
+
+
+
+
+   private static class RequestWithPrincipal extends HttpServletRequestWrapper
+   {
+      private final VlkbUser user;
+
+      public RequestWithPrincipal(HttpServletRequest request, VlkbUser user)
+      {
+         super(request);
+         this.user = user;
+      }
+
+      @Override
+      public Principal getUserPrincipal() {
+         return user;
+      }
+   }
+*/
+
+
+
+
+   /*
+      private boolean isMapStringObject(Object obj)
+      {
+      if(obj instanceof Map)
+      {
+      Map map = (Map)obj;
+
+      Set<?> s = map.keySet();
+      Iterator<?> it = s.iterator();
+      while(it.hasNext())
+      {
+      Object el = it.next();
+      if(! (el instanceof String))
+      {
+      return false;
+      }
+      }
+
+      return true;
+      }
+      else
+      return false;
+      }
+      */
+
+
+}
+
diff --git a/auth/src/main/java/IamTokenSettings.java b/auth/src/main/java/IamTokenSettings.java
new file mode 100644
index 0000000000000000000000000000000000000000..a5ec5a2d3f2f5c8084851bdc90153f92eda83aef
--- /dev/null
+++ b/auth/src/main/java/IamTokenSettings.java
@@ -0,0 +1,106 @@
+
+import java.util.logging.Logger;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.util.Properties;
+import java.io.PrintWriter;
+
+/* for Csv-loadSubsurveys (unused):
+import com.opencsv.*;
+import com.opencsv.exceptions.*;
+import java.io.FileReader;
+import java.io.FileNotFoundException;
+import java.util.Map;
+import java.util.List;
+import java.util.ArrayList;
+*/
+
+
+
+class IamTokenSettings
+{
+   private static final Logger LOGGER = Logger.getLogger("IamTokenSettings");
+
+   static final String VLKB_PROPERTIES = "iamtoken.properties";
+
+   public static class Security
+   {
+      String jwksEndpoint;
+      String introspectEndpoint;
+      String clientName;
+      String clientPassword;
+      String resourceId;
+
+      String non_authn_username = null;
+   }
+
+   public Security security;
+
+
+   // will not start without config-file
+   // no reasonable code-defaults can be invented
+   public static IamTokenSettings getInstance()
+   {
+      try
+      {
+         InputStream ins =
+            IamTokenSettings.class.getClassLoader().getResourceAsStream(VLKB_PROPERTIES);
+
+         if (ins != null)
+         {
+            Properties properties = new Properties();
+            properties.load(ins);
+
+            Security  security = loadSecurity(properties);
+
+            return new IamTokenSettings(security);
+         }
+         else
+         {
+            throw new IllegalStateException(VLKB_PROPERTIES + " not found in classpath");
+         }
+
+      }
+      catch(IOException ex)
+      {
+         throw new IllegalStateException("Error while loading " + VLKB_PROPERTIES + " file", ex);
+      }
+   }
+
+   public String getIntrospectUrl()  { return this.security.introspectEndpoint; }
+   public String getClientName()     { return this.security.clientName; }
+   public String getClientPassword() { return this.security.clientPassword; }
+
+   /* FIXME: callers fail if any of these getters returns null */
+
+
+   private IamTokenSettings(Security security)
+   {
+      this.security  = security;
+   }
+
+
+   private static Security loadSecurity(Properties properties)
+   {
+      Security security = new IamTokenSettings.Security();
+      security.jwksEndpoint = getPropertyStriped(properties, "jwks_url");
+      security.introspectEndpoint = getPropertyStriped(properties, "introspect");
+      security.clientName = getPropertyStriped(properties, "client_name");
+      security.clientPassword = getPropertyStriped(properties, "client_password");
+      security.resourceId = getPropertyStriped(properties, "resource_id");
+      security.non_authn_username = getPropertyStriped(properties, "non_authenticated_username");
+      return security;
+   }
+
+
+   private static String getPropertyStriped(Properties properties, String setting)
+   {
+      String st = properties.getProperty(setting);
+      if(st != null) return st.strip();
+      else return st;
+   }
+
+
+}
+
diff --git a/auth/src/main/java/IntrospectResponse.java b/auth/src/main/java/IntrospectResponse.java
new file mode 100644
index 0000000000000000000000000000000000000000..4ac4b8b0e519cf051c34e6f47683a944bcfd1174
--- /dev/null
+++ b/auth/src/main/java/IntrospectResponse.java
@@ -0,0 +1,111 @@
+
+import java.util.logging.Logger;
+
+// 1. Https
+import java.net.URL;
+import java.io.*;
+import javax.net.ssl.HttpsURLConnection;
+
+// 2. json deser
+//import org.codehaus.jackson.map.ObjectMapper;
+import com.fasterxml.jackson.annotation.JsonProperty;
+import com.fasterxml.jackson.core.JsonProcessingException;
+import com.fasterxml.jackson.databind.JsonNode;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.node.ArrayNode;
+import com.fasterxml.jackson.databind.node.JsonNodeFactory;
+import com.fasterxml.jackson.databind.node.ObjectNode;
+import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
+import com.fasterxml.jackson.annotation.JsonAutoDetect;
+
+class IntrospectResponse
+{
+   private static final Logger LOGGER = Logger.getLogger("IntrospectResponse");
+
+   public boolean active;
+   public String  scope;
+
+   public IntrospectResponse(String uPass, String url, String token) throws Exception
+   {
+      final String POST_PARAM = "token=" + token;
+      String resp = doHttps(uPass, url, POST_PARAM);
+      decodeIRespJson(resp);
+   }
+
+   public boolean isTokenActive() { return active; }
+
+   public String getPathFromStorageReadScope()
+   {
+      if(scope == null)
+      {
+         throw new IllegalStateException("Introspect response has scope == null; the token is probably not active.");
+      }
+      else
+      {
+         String[] scopes = scope.split(" ");
+         for(String scope : scopes)
+         {
+            if(scope.startsWith("storage.read:"))
+            {
+               return scope.substring("storage.read:".length());
+            }
+         }
+         throw new IllegalStateException(
+               "Introspect Response with 'storage.read' scope expected, but received scope: " + scope);
+      }
+   }
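+
+   // Example (illustrative): for scope "openid storage.read:/vialactea/surveys"
+   // getPathFromStorageReadScope() returns "/vialactea/surveys".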
+
+
+
+   private String doHttps(String uPass, String url, String postParams) throws Exception
+   {
+      LOGGER.info("doHttps : " + url);
+
+      URL myUrl = new URL(url);
+      HttpsURLConnection conn = (HttpsURLConnection)myUrl.openConnection();
+      conn.setRequestMethod("POST");
+
+      // java.util.Base64 replaces javax.xml.bind.DatatypeConverter (removed from the JDK since Java 11)
+      String basicAuth = "Basic " + java.util.Base64.getEncoder().encodeToString(uPass.getBytes());
+
+      conn.setRequestProperty ("Authorization", basicAuth);
+
+
+      conn.setDoOutput(true);
+      OutputStream os = conn.getOutputStream();
+      os.write(postParams.getBytes());
+      os.flush();
+      os.close();
+
+      InputStream is = conn.getInputStream();
+      InputStreamReader isr = new InputStreamReader(is);
+      BufferedReader br = new BufferedReader(isr);
+
+      String inputLine;
+      String jsonKeys = "";
+      while ((inputLine = br.readLine()) != null) {
+         jsonKeys = jsonKeys + inputLine;
+      }
+
+      br.close();
+
+      return jsonKeys;
+   }
+   /*
+      @JsonAutoDetect(fieldVisibility=JsonAutoDetect.Visibility.ANY);
+      static class IResp
+      {
+      boolean active;
+      String[] scope;
+      }
+      */
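+   // Example introspection response body (illustrative):
+   //   {"active": true, "scope": "openid storage.read:/vialactea"}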
+   private void decodeIRespJson(String json) throws IOException
+   {
+      ObjectMapper mapper = new ObjectMapper();
+      JsonNode root = mapper.readTree(json); // parse once, reuse the tree
+      active = root.get("active").asBoolean();
+      if(active)
+      {
+         scope = root.get("scope").asText();
+      }
+   }
+
+}
diff --git a/auth/src/main/java/InvalidTokenException.java b/auth/src/main/java/InvalidTokenException.java
new file mode 100644
index 0000000000000000000000000000000000000000..c248c9ffed6eea80fb76469218f5cbce55edb057
--- /dev/null
+++ b/auth/src/main/java/InvalidTokenException.java
@@ -0,0 +1,10 @@
+
+
+public class InvalidTokenException extends RuntimeException
+{
+   public InvalidTokenException(String message)
+   {
+      super(message);
+   }
+}
+
diff --git a/auth/src/main/java/Ivoid.java b/auth/src/main/java/Ivoid.java
new file mode 100644
index 0000000000000000000000000000000000000000..5b36ade61214ebecd4295e60411d715b91257109
--- /dev/null
+++ b/auth/src/main/java/Ivoid.java
@@ -0,0 +1,30 @@
+
+
+class Ivoid
+{
+   private String localPart;
+
+   public Ivoid(String ivoid)
+   {
+      if(ivoid.startsWith("ivo://"))
+      {
+         int lastQ = ivoid.lastIndexOf("?");
+         if( lastQ < 0 )
+         {
+            throw new IllegalArgumentException("IVOID must contain '?' but none found in: " + ivoid);
+         }
+         else
+         {
+            localPart = ivoid.substring(lastQ + 1); // +1: skip '?'
+            // if '?' is the last character, substring(length()) returns "" (empty string)
+         }
+      }
+      else
+      {
+         throw new IllegalArgumentException("IVOID must start with 'ivo://' but it is: " + ivoid);
+      }
+   }
+
+   public String getLocalPart(){ return localPart; }
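+
+   // Usage sketch (illustrative identifier):
+   //   new Ivoid("ivo://ia2.inaf.it/vlkb/datasets?surveys/some.fits").getLocalPart()
+   //   returns "surveys/some.fits"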
+}
diff --git a/auth/src/main/java/NeaSigningKeyResolver.java b/auth/src/main/java/NeaSigningKeyResolver.java
new file mode 100644
index 0000000000000000000000000000000000000000..72af2343a183b24d4c8e9977fd0823ad4fb05200
--- /dev/null
+++ b/auth/src/main/java/NeaSigningKeyResolver.java
@@ -0,0 +1,191 @@
+
+// 1. HTTPS
+import java.net.URL;
+import java.io.*;
+import javax.net.ssl.HttpsURLConnection;
+
+// 2. json deser
+//import org.codehaus.jackson.map.ObjectMapper;
+import com.fasterxml.jackson.annotation.JsonProperty;
+import com.fasterxml.jackson.core.JsonProcessingException;
+import com.fasterxml.jackson.databind.JsonNode;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.node.ArrayNode;
+import com.fasterxml.jackson.databind.node.JsonNodeFactory;
+import com.fasterxml.jackson.databind.node.ObjectNode;
+import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
+import com.fasterxml.jackson.annotation.JsonAutoDetect;
+
+
+// 3, extract PublicKey
+import java.util.Base64;
+import java.io.ByteArrayInputStream;
+import java.security.GeneralSecurityException; 
+import java.security.PublicKey; 
+import java.security.Signature; 
+import java.security.cert.CertificateFactory; 
+import java.security.cert.X509Certificate; 
+
+// 4, validate token
+import java.security.spec.InvalidKeySpecException;
+import java.security.NoSuchAlgorithmException;
+import java.security.Key;
+import java.security.PublicKey;
+import io.jsonwebtoken.Header;
+import io.jsonwebtoken.Claims;
+import io.jsonwebtoken.Jwt;
+import io.jsonwebtoken.Jws;
+import io.jsonwebtoken.JwsHeader;
+import io.jsonwebtoken.Jwts;
+import io.jsonwebtoken.jackson.io.JacksonDeserializer;
+import io.jsonwebtoken.SigningKeyResolverAdapter;
+
+// only dbg: when keys taken from file, not URL
+import java.nio.file.Files;
+import java.nio.file.Paths;
+
+import java.util.logging.Logger;
+
+public class NeaSigningKeyResolver extends SigningKeyResolverAdapter
+{
+   private static final Logger LOGGER = Logger.getLogger(NeaSigningKeyResolver.class.getName());
+
+   // FIXME to config file
+   //final String pubkeyURL = "https://sso.neanias.eu/auth/realms/neanias-development";
+   //final String keysURL = pubkeyURL + "/protocol/openid-connect/certs";
+   // OR:
+   // final String realmName = "skao-devel"; --> url= .../realms/ + realm + /protocol/openid-connect/...
+   String pubkeyURL = "https://sso.neanias.eu/auth/realms/neanias-production";
+   String keysURL = pubkeyURL + "/protocol/openid-connect/certs";
+   // from ESc email: https://sso.neanias.eu/auth/realms/neanias-production/protocol/openid-connect/auth
+
+   NeaSigningKeyResolver(String keysUrl) {this.keysURL = keysUrl;}
+
+   private String doHttps() throws Exception
+   {
+      LOGGER.info("doHttps : " + keysURL);
+
+      URL myUrl = new URL(keysURL);
+      HttpsURLConnection conn = (HttpsURLConnection)myUrl.openConnection();
+      InputStream is = conn.getInputStream();
+      InputStreamReader isr = new InputStreamReader(is);
+      BufferedReader br = new BufferedReader(isr);
+
+      String inputLine;
+      String jsonKeys = ""; 
+      while ((inputLine = br.readLine()) != null) {
+         jsonKeys = jsonKeys + inputLine;
+      }
+
+      br.close();
+
+      return jsonKeys;
+   }
+
+
+
+
+   // deserialize keys
+
+   @JsonAutoDetect(fieldVisibility=JsonAutoDetect.Visibility.ANY)
+   static class NeaKey 
+   {
+      String kid;
+      String kty;
+      String alg;
+      String use;
+      String n;
+      String e;
+      String[] x5c;
+      String x5t;
+      @JsonProperty("x5t#S256") String x5t_S256;
+   }
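+
+   // Shape of the JWKS document fetched from keysURL (illustrative):
+   //   { "keys": [ { "kid": "...", "kty": "RSA", "alg": "RS256", "use": "sig",
+   //                 "n": "...", "e": "AQAB", "x5c": ["<base64 DER certificate>"],
+   //                 "x5t": "...", "x5t#S256": "..." } ] }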
+
+
+   private String getCertFromKeys(String jsonKeys, String keyId)
+         throws JsonProcessingException, GeneralSecurityException, IOException
+      {
+         LOGGER.info( "NeaSigningKeyResolver::getCertFromKeys");
+
+         ObjectMapper mapper = new ObjectMapper();
+
+         String cert = null;
+
+         JsonNode keysNode = mapper.readTree(jsonKeys).get("keys");
+         if(keysNode.isArray())
+         {
+            for (JsonNode node : keysNode)
+            {
+               String nodeContent = mapper.writeValueAsString(node);
+               NeaKey key = mapper.readValue(nodeContent,NeaKey.class);
+
+               LOGGER.info("keyId    : " + keyId
+                     +"\nKey::kid : " + key.kid);
+
+               if(keyId.equals(key.kid))
+               {
+                  cert = key.x5c[0];
+               }
+            }
+         }
+
+         return cert;
+      }
+
+
+
+
+   private PublicKey getPublicKeyFromPemCert(String certBase64)
+         throws GeneralSecurityException
+      {
+         LOGGER.info( "NeaSigningKeyResolver::getPublicKeyFromPemCert");
+
+         CertificateFactory fac = CertificateFactory.getInstance("X509");
+         ByteArrayInputStream in = new ByteArrayInputStream(Base64.getDecoder().decode(certBase64));
+         X509Certificate cert = (X509Certificate)fac.generateCertificate(in);
+         return cert.getPublicKey();
+      }
+
+
+
+
+   private Key lookupVerificationKey(String keyId)
+         throws Exception, GeneralSecurityException
+      {
+         LOGGER.info( "NeaSigningKeyResolver::lookupVerificationKey" );
+
+         String jsonKeys = doHttps();
+
+         String cert = getCertFromKeys(jsonKeys, keyId);
+
+         PublicKey pubKey = (PublicKey)getPublicKeyFromPemCert(cert);
+
+         return pubKey;
+      }
+
+
+
+
+   @Override
+   public Key resolveSigningKey(JwsHeader jwsHeader, Claims claims)
+   {
+      LOGGER.info( "NeaSigningKeyResolver::resolveSigningKey" );
+
+      // inspect the JWS header, look up the matching JWKS entry and return its public key
+      String keyId = jwsHeader.getKeyId();
+
+      Key key = null;
+      try
+      {
+         key = lookupVerificationKey(keyId);
+      }
+      catch(Exception e)
+      {
+         LOGGER.severe("key lookup failed: " + e.toString());
+         e.printStackTrace();
+      }
+
+      return key;
+   }
+}
+
diff --git a/auth/src/main/java/NeaTokenFilter.java b/auth/src/main/java/NeaTokenFilter.java
new file mode 100644
index 0000000000000000000000000000000000000000..f24dc4c3ab688d2c096083e403991d97af58a3af
--- /dev/null
+++ b/auth/src/main/java/NeaTokenFilter.java
@@ -0,0 +1,241 @@
+
+import io.jsonwebtoken.JwtException;
+import io.jsonwebtoken.InvalidClaimException;
+import io.jsonwebtoken.Header;
+import io.jsonwebtoken.Claims;
+import io.jsonwebtoken.Jwt;
+import io.jsonwebtoken.Jws;
+import io.jsonwebtoken.Jwts;
+import io.jsonwebtoken.jackson.io.JacksonDeserializer;
+
+import java.security.spec.InvalidKeySpecException;
+import java.security.NoSuchAlgorithmException;
+
+import javax.servlet.http.HttpServletRequestWrapper;
+import java.security.Principal;
+
+import java.io.IOException;
+import java.util.*; // List, Map, Set, Iterator
+
+import java.util.logging.Logger;
+import javax.servlet.Filter;
+import javax.servlet.FilterChain;
+import javax.servlet.FilterConfig;
+import javax.servlet.ServletException;
+import javax.servlet.ServletRequest;
+import javax.servlet.ServletResponse;
+import javax.servlet.http.HttpServletRequest;
+import javax.servlet.http.HttpServletResponse;
+
+//import NeaSigningKeyResolver;
+
+public class NeaTokenFilter implements Filter
+{
+   private static final Logger LOGGER = Logger.getLogger("NeaTokenFilter");
+   private static final NeaTokenSettings settings = NeaTokenSettings.getInstance();
+
+   final String resourceId = settings.security.resourceId; //"vlkb"
+   //final String realmName = "neanias-production";
+   //final String keysUrl = "https://sso.neanias.eu/auth/realms/" + realmName + "/protocol/openid-connect/certs";
+   final String keysUrl = settings.security.jwksEndpoint;
+
+   @Override
+   public void init(FilterConfig fc) throws ServletException {}
+
+   @Override
+   public void destroy() {}
+
+   @Override
+   public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
+      throws IOException, ServletException
+   {
+      HttpServletRequest request = (HttpServletRequest) req;
+      HttpServletResponse response = (HttpServletResponse) res;
+
+      String qString = request.getQueryString();
+      if(qString == null)
+         LOGGER.info(request.getRequestURL().toString());
+      else
+         LOGGER.info(request.getRequestURL() + "    " + qString);
+
+      String authHeader = request.getHeader("Authorization");
+      if (authHeader == null)
+      {
+         boolean non_authenticated_request = (settings.security.non_authn_username != null);
+
+         if(non_authenticated_request)
+         {
+            chain.doFilter(request, response);
+         }
+         else
+         {
+            LOGGER.info("Request without Authorization header, no Principal added");
+            response.sendError(HttpServletResponse.SC_BAD_REQUEST,
+                  "No Authorization in HTTP-header. Only authorized requests allowed.");
+         }
+         return;
+      }
+      else
+      {
+         if (authHeader.startsWith("Bearer ") && (authHeader.length() > "Bearer ".length()))
+         {
+            LOGGER.info("Request has Authorization header with a Bearer token");
+
+            String jws = authHeader.substring("Bearer ".length());
+
+            try
+            {
+               VlkbUser user = getUserFromAccessToken(jws);
+
+               HttpServletRequestWrapper requestWithPrincipal
+                  = new RequestWithPrincipal(request, user);
+
+               chain.doFilter(requestWithPrincipal, response);
+               return;
+            }
+            catch (JwtException | InvalidTokenException ex)
+            {
+               LOGGER.warning("Token invalid: " + ex.toString());
+               response.sendError(HttpServletResponse.SC_BAD_REQUEST, "Token invalid");
+               return;
+            }
+            catch (Exception ex)
+            {
+               LOGGER.severe(ex.toString());
+               response.sendError(HttpServletResponse.SC_INTERNAL_SERVER_ERROR,
+                     "Error during authorization");
+               return;
+            }
+         }
+         else
+         {
+            LOGGER.warning("Request with Authorization header but without Bearer token");
+
+            response.sendError(HttpServletResponse.SC_BAD_REQUEST,
+                  "Only Bearer authorization supported or token missing");
+            return;
+         }
+      }
+   }
+
+
+   private boolean isMapStringObject(Object obj)
+   {
+      if(obj instanceof Map)
+      {
+         Map map = (Map)obj;
+
+         for(Object el : map.keySet())
+         {
+            if(!(el instanceof String))
+            {
+               return false;
+            }
+         }
+
+         return true;
+      }
+      else
+         return false;
+   }
+
+   // validate claims and extract the roles granted for this resource
+
+   private List<String> validateAndParseRoles(Claims claims)
+   {
+      Object obj = claims.get("resource_access");
+
+      if(obj == null)
+      {
+         LOGGER.warning("Token invalid: missing 'resource_access' claim");
+         throw new InvalidTokenException("missing 'resource_access' claim");
+      }
+
+      if(!isMapStringObject(obj))
+      {
+         LOGGER.warning("Token invalid: 'resource_access' claim is not a JSON object");
+         throw new InvalidTokenException("'resource_access' claim is not a JSON object");
+      }
+
+      Map<String, Object> resourceAccess = (Map<String, Object>)obj;
+
+      Map<String, Object> resource = (Map<String, Object>)resourceAccess.get(resourceId);
+      if (resource != null)
+      {
+         List<String> roles = (List<String>)resource.get("roles");
+         return roles;
+      }
+      else
+      {
+         LOGGER.warning("Token invalid: 'resource_access' has no entry for resource: " + resourceId);
+         throw new InvalidTokenException(
+               "'resource_access' has no entry for resource: " + resourceId);
+      }
+   }
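+
+   // Expected claim layout consumed above (illustrative):
+   //   "resource_access": { "<resourceId>": { "roles": ["role-a", "role-b"] } }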
+
+
+
+   VlkbUser getUserFromAccessToken(String jwsString)
+      //throws JwtException, InvalidTokenException  <-- FIXME
+   {
+      long clockSkew = 3 * 60; // 3 minutes; FIXME get from config file
+
+      Jws<Claims> jws = Jwts
+         .parser()
+         .setAllowedClockSkewSeconds(clockSkew) // FIXME needed ?
+         .setSigningKeyResolver(new NeaSigningKeyResolver(keysUrl))
+         .build()
+         .parseClaimsJws(jwsString);
+
+      Claims claims = jws.getBody();
+
+      List<String> roles = validateAndParseRoles(claims);
+
+      // set User/Principal
+
+      VlkbUser user = new VlkbUser();
+      user.setAccessToken(jwsString);
+      user.setExpirationTime(0); // FIXME
+      user.setUserId((String) claims.get("sub"));
+      user.setUserLabel((String) claims.get("name"));
+      user.setGroups(roles);
+
+      return user;
+   }
+
+
+
+
+
+   private static class RequestWithPrincipal extends HttpServletRequestWrapper
+   {
+      private final VlkbUser user;
+
+      public RequestWithPrincipal(HttpServletRequest request, VlkbUser user)
+      {
+         super(request);
+         this.user = user;
+      }
+
+      @Override
+      public Principal getUserPrincipal() {
+         return user;
+      }
+   }
+
+}
+
diff --git a/auth/src/main/java/NeaTokenSettings.java b/auth/src/main/java/NeaTokenSettings.java
new file mode 100644
index 0000000000000000000000000000000000000000..ce71c442a1d34e917e403ccab445f983f884a5b0
--- /dev/null
+++ b/auth/src/main/java/NeaTokenSettings.java
@@ -0,0 +1,98 @@
+
+import java.util.logging.Logger;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.util.Properties;
+import java.io.PrintWriter;
+
+/* imports for Csv-loadSubsurveys (currently disabled):
+import com.opencsv.*;
+import com.opencsv.exceptions.*;
+import java.io.FileReader;
+import java.io.FileNotFoundException;
+import java.util.Map;
+import java.util.List;
+import java.util.ArrayList;
+*/
+
+
+
+class NeaTokenSettings
+{
+   private static final Logger LOGGER = Logger.getLogger("NeaTokenSettings");
+
+   static final String VLKB_PROPERTIES = "neatoken.properties";
+
+   public static class Security
+   {
+      String jwksEndpoint;
+      String resourceId;
+
+      String non_authn_username = null;
+   }
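+
+   // Example neatoken.properties (illustrative values; the keys are the ones
+   // read in loadSecurity() below):
+   //
+   //   jwks_url    = https://sso.example.org/auth/realms/myrealm/protocol/openid-connect/certs
+   //   resource_id = vlkb
+   //   non_authenticated_username = anonymous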
+
+   public Security security;
+
+
+   // will not start without config-file
+   // no reasonable code-defaults can be invented
+   public static NeaTokenSettings getInstance()
+   {
+      try
+      {
+         InputStream ins =
+            NeaTokenSettings.class.getClassLoader().getResourceAsStream(VLKB_PROPERTIES);
+
+         if (ins != null)
+         {
+            Properties properties = new Properties();
+            properties.load(ins);
+
+            Security  security  = loadSecurity(properties);
+
+            return new NeaTokenSettings(security);
+         }
+         else
+         {
+            throw new IllegalStateException(VLKB_PROPERTIES + " not found in classpath");
+         }
+
+      }
+      catch(IOException ex)
+      {
+         throw new IllegalStateException("Error while loading " + VLKB_PROPERTIES + " file", ex);
+      }
+   }
+
+
+
+   /* FIXME: callers fail if any of the loaded strings is null */
+
+
+   private NeaTokenSettings(Security security)
+   {
+      this.security  = security;
+   }
+
+
+   private static Security loadSecurity(Properties properties)
+   {
+      Security security = new NeaTokenSettings.Security();
+      security.jwksEndpoint = getPropertyStriped(properties, "jwks_url");
+      security.resourceId = getPropertyStriped(properties, "resource_id");
+      security.non_authn_username = getPropertyStriped(properties, "non_authenticated_username");
+      return security;
+   }
+
+
+   private static String getPropertyStriped(Properties properties, String setting)
+   {
+      String st = properties.getProperty(setting);
+      if(st != null) return st.strip();
+      else return st;
+   }
+
+
+}
+
diff --git a/auth/src/main/java/VlkbUser.java b/auth/src/main/java/VlkbUser.java
new file mode 100644
index 0000000000000000000000000000000000000000..6691c03693d6aa541c2ff6e9ea89287d3dfba768
--- /dev/null
+++ b/auth/src/main/java/VlkbUser.java
@@ -0,0 +1,96 @@
+
+import java.security.Principal;
+import java.util.List;
+
+public class VlkbUser implements Principal {
+
+    private String userId;
+    private String userLabel;
+    private String accessToken;
+    private String idToken;
+    private String refreshToken;
+    private long expirationTime;
+    private List<String> groups;
+
+    @Override
+    public String getName() {
+        return userId;
+    }   
+
+    public VlkbUser setUserId(String userId) {
+        this.userId = userId;
+        return this;
+    }   
+
+    public String getUserLabel() {
+        return userLabel;
+    }   
+
+    public VlkbUser setUserLabel(String userLabel) {
+        this.userLabel = userLabel;
+        return this;
+    }   
+
+    public String getAccessToken() {
+        return accessToken;
+    }   
+
+    public VlkbUser setAccessToken(String accessToken) {
+        this.accessToken = accessToken;
+        return this;
+    }   
+
+    public VlkbUser setRefreshToken(String refreshToken) {
+        this.refreshToken = refreshToken;
+        return this;
+    }   
+
+    public String getRefreshToken() {
+        return refreshToken;
+    }   
+
+    public String getIdToken() {
+        return idToken;
+    }   
+
+    public VlkbUser setIdToken(String idToken) {
+        this.idToken = idToken;
+        return this;
+    }
+
+    public long getExpirationTime() {
+        return expirationTime;
+    }
+
+    public VlkbUser setExpirationTime(long expirationTime) {
+        this.expirationTime = expirationTime;
+        return this;
+    }
+
+    public boolean isTokenExpired() {
+        return getExpiresIn() < 0;
+    }
+
+    public long getExpiresIn() {
+        return expirationTime - System.currentTimeMillis() / 1000;
+    }
+
+    public VlkbUser setExpiresIn(int expiresIn) {
+        this.expirationTime = System.currentTimeMillis() / 1000 + expiresIn;
+        return this;
+    }
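+
+    // Expiration bookkeeping is in epoch seconds: e.g. setExpiresIn(3600) sets
+    // expirationTime to now + 1 hour, getExpiresIn() then counts down towards
+    // zero, and isTokenExpired() becomes true once it turns negative.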
+
+    public List<String> getGroups() {
+        return groups;
+    }
+
+    public String[] getGroupsAsArray() {
+        return groups.toArray(new String[0]);
+    }
+
+
+    public void setGroups(List<String> groups) {
+        this.groups = groups;
+    }
+}
+
diff --git a/auth/src/test/java/Main.java b/auth/src/test/java/Main.java
new file mode 100644
index 0000000000000000000000000000000000000000..a97f4542c66c8f969c3bd487fcb2d19b22d41c07
--- /dev/null
+++ b/auth/src/test/java/Main.java
@@ -0,0 +1,43 @@
+import java.io.IOException;
+import java.nio.file.Files;
+import java.nio.file.Paths;
+
+
+class Main
+{
+   public static void main(String[] args)
+   {
+      System.out.println("calling Introspect");
+
+      final String tokenFilename = (args.length > 0) ? args[0] : "token.base64";
+
+      final String USER = "02cc260f-9837-4907-b2cb-a1a2d764fb15";
+      final String PWD  = "AJMi3qrB6AHRp_6y55tEwU-IpJ8uZ6X4QXeQ3W4la6dc-BlkzAY1OQpAE9hb1W7-VfYl4208FUtjE2Cl3hUYLkQ";
+      final String IEP = "https://iam-escape.cloud.cnaf.infn.it/introspect";
+
+      try
+      {
+         final String TOKEN = new String(Files.readAllBytes(Paths.get(tokenFilename)));
+
+         IntrospectResponse ir = new IntrospectResponse(USER+":"+PWD, IEP, TOKEN);
+
+         System.out.println("active: " +  ir.active );
+         System.out.println("scope : " +  ir.scope );
+         if(ir.active)
+            System.out.println("IR: " + ir.getPathFromStorageReadScope());
+      }
+      catch(Exception e)
+      {
+         System.out.println("EXCPT: " + e.getMessage());
+         e.printStackTrace();
+      }
+   }
+}
diff --git a/data-access/engine/Makefile b/data-access/engine/Makefile
new file mode 100644
index 0000000000000000000000000000000000000000..bb728ded32f7de86da2dd5e2ede7ecf8f262968a
--- /dev/null
+++ b/data-access/engine/Makefile
@@ -0,0 +1,73 @@
+
+PREFIX ?= /usr/local
+
+AMQP_QUEUE ?= vlkbdevel
+INST_PATH ?= $(PREFIX)
+VERSION ?= $(shell git describe)
+
+
+.PHONY: all
+
+all: build
+
+.PHONY: build
+build:
+	make -C src/common VERSION=$(VERSION)
+	make -C src/vlkbd VERSION=$(VERSION)
+	make -C src/vlkb-obscore VERSION=$(VERSION)
+	make -C src/vlkb VERSION=$(VERSION)
+
+.PHONY: clean
+clean:
+	make -C src/common clean
+	make -C src/vlkbd clean
+	make -C src/vlkb-obscore clean
+	make -C src/vlkb clean
+
+
+
+.PHONY: install
+install:
+	mkdir -p $(INST_PATH)/bin
+	install ./src/vlkb/bin/vlkb $(INST_PATH)/bin
+	install ./src/vlkb-obscore/bin/vlkb-obscore $(INST_PATH)/bin
+	install ./src/vlkbd/bin/vlkbd $(INST_PATH)/bin
+	install vlkbd_exec.sh $(INST_PATH)/bin
+
+.PHONY: uninstall
+uninstall:
+	rm -f $(INST_PATH)/bin/vlkb
+	rm -f $(INST_PATH)/bin/vlkb-obscore
+	rm -f $(INST_PATH)/bin/vlkbd
+	rm -f $(INST_PATH)/bin/vlkbd_exec.sh
+
+
+
+
+# vlkb-devel site local
+
+.PHONY: config
+config:
+	mkdir -p $(INST_PATH)/etc/vlkb-obscore
+	mkdir -p $(INST_PATH)/etc/vlkbd
+	cp config/vlkb-obscore.datasets.conf $(INST_PATH)/etc/vlkb-obscore/datasets.conf
+	cp config/vlkbd.datasets.conf $(INST_PATH)/etc/vlkbd/datasets.conf
+
+.PHONY: start
+start:
+	vlkbd_exec.sh localhost $(AMQP_QUEUE) $(INST_PATH)/etc/vlkbd/datasets.conf
+
+.PHONY: stop
+stop:
+	pkill -f 'vlkbd.* $(AMQP_QUEUE).*$(INST_PATH)/etc/vlkbd/datasets.conf'
+
+.PHONY: status
+status:
+	ps ax | grep vlkbd
+
+.PHONY: reload
+reload: stop start
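+
+# Example invocation (illustrative paths):
+#   make build
+#   make install config INST_PATH=/opt/vlkb
+#   make start AMQP_QUEUE=vlkbdevel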
+
+
+
diff --git a/data-access/engine/config/dbms.conf-localhost b/data-access/engine/config/dbms.conf-localhost
new file mode 100644
index 0000000000000000000000000000000000000000..6b4c6c9194e7bd4c9843708a945f5f781049b47d
--- /dev/null
+++ b/data-access/engine/config/dbms.conf-localhost
@@ -0,0 +1,5 @@
+# DB connection (see PostgreSQL manual 'Connection URIs')
+pg_uri=postgresql://vialactea:ia2vlkb@localhost:5432/vialactea
+pg_schema=datasets
+
+
diff --git a/data-access/engine/config/dbms.conf-pasquale b/data-access/engine/config/dbms.conf-pasquale
new file mode 100644
index 0000000000000000000000000000000000000000..5d3a1ecd68dc942ee4436738356e734691d1bc7a
--- /dev/null
+++ b/data-access/engine/config/dbms.conf-pasquale
@@ -0,0 +1,4 @@
+# DB connection (see PostgreSQL manual 'Connection URIs')
+pg_uri=postgresql://vialactea:ia2vlkb@pasquale.ia2.inaf.it:5432/vialactea
+pg_schema=datasets
+
diff --git a/data-access/engine/config/dbms.conf-pasquale-devel b/data-access/engine/config/dbms.conf-pasquale-devel
new file mode 100644
index 0000000000000000000000000000000000000000..3b441dfa1f7e07fa6cd7335671f183f5fad4d026
--- /dev/null
+++ b/data-access/engine/config/dbms.conf-pasquale-devel
@@ -0,0 +1,4 @@
+# DB connection (see PostgreSQL manual 'Connection URIs')
+pg_uri=postgresql://vialactea:ia2vlkb@pasquale.ia2.inaf.it:5432/vialacteadevel
+pg_schema=datasetsdevel
+
diff --git a/data-access/engine/config/vlkb-obscore.datasets.conf b/data-access/engine/config/vlkb-obscore.datasets.conf
new file mode 100644
index 0000000000000000000000000000000000000000..f01f5836de2fb5a4ff341cd14e0ae284621b5ccc
--- /dev/null
+++ b/data-access/engine/config/vlkb-obscore.datasets.conf
@@ -0,0 +1,20 @@
+# DB connection (see PostgreSQL manual 'Connection URIs')
+pg_uri=postgresql://vialactea:ia2vlkb@localhost:5432/vialactea
+pg_schema=datasets
+
+
+
+# root of path for local access
+fits_path_surveys=/srv/vlkb/surveys
+
+# obs_publisher_did = <obscore publisher> ? <generated-pubdid>
+obscore_publisher=ivo://ia2.inaf.it/vlkb/datasets
+
+# full access URL: <obscore_access_url>/<storage-path>/<file-name>
+obscore_access_url=https://vlkb-devel.ia2.inaf.it:8443/vlkb/datasets/surveys
+obscore_access_format=application/fits
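+# e.g. a file stored under <storage-path>/<file-name> is served at
+#   https://vlkb-devel.ia2.inaf.it:8443/vlkb/datasets/surveys/<storage-path>/<file-name>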
+
+# logging (holds last exec only)
+# log_dir=/tmp
+# log_filename=vlkb-obscore.log
+
diff --git a/data-access/engine/config/vlkbd.datasets.conf b/data-access/engine/config/vlkbd.datasets.conf
new file mode 100644
index 0000000000000000000000000000000000000000..c534e77f3d25db9909c36d8722c7d7a76b243955
--- /dev/null
+++ b/data-access/engine/config/vlkbd.datasets.conf
@@ -0,0 +1,10 @@
+
+# path to original files
+fits_path_surveys=/srv/vlkb/surveys
+# path to generated cutouts
+fits_path_cutouts=/srv/vlkb/cutouts
+
+# logging records last request only
+# log_dir=/tmp
+# log_filename=vlkbd.log
+
diff --git a/data-access/engine/ext/aria-csv/Makefile b/data-access/engine/ext/aria-csv/Makefile
new file mode 100644
index 0000000000000000000000000000000000000000..af7eae91360351a49339207ef9c0a087bfb17cc9
--- /dev/null
+++ b/data-access/engine/ext/aria-csv/Makefile
@@ -0,0 +1,12 @@
+
+
+# empty if nothing needs to be built
+
+all:
+
+
+
+clean:
+
+
+
diff --git a/data-access/engine/ext/aria-csv/include/aria-csv-parser/parser.hpp b/data-access/engine/ext/aria-csv/include/aria-csv-parser/parser.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..c1350b82c57bc1c5fb8833903314bf3dddd511c5
--- /dev/null
+++ b/data-access/engine/ext/aria-csv/include/aria-csv-parser/parser.hpp
@@ -0,0 +1,352 @@
+#ifndef ARIA_CSV_H
+#define ARIA_CSV_H
+
+// RBu: from git clone https://github.com/AriaFallah/csv-parser.git
+
+#include <fstream>
+#include <memory>
+#include <stdexcept>
+#include <string>
+#include <vector>
+
+namespace aria {
+  namespace csv {
+    enum class Term { CRLF = -2 };
+    enum class FieldType { DATA, ROW_END, CSV_END };
+    using CSV = std::vector<std::vector<std::string>>;
+
+    // Checking for '\n', '\r', and '\r\n' by default
+    inline bool operator==(const char c, const Term t) {
+      switch (t) {
+        case Term::CRLF:
+          return c == '\r' || c == '\n';
+        default:
+          return static_cast<char>(t) == c;
+      }
+    }
+
+    inline bool operator!=(const char c, const Term t) {
+      return !(c == t);
+    }
+
+    // Wraps returned fields so we can also indicate
+    // that we hit row endings or the end of the csv itself
+    struct Field {
+      explicit Field(FieldType t): type(t), data(nullptr) {}
+      explicit Field(const std::string& str): type(FieldType::DATA), data(&str) {}
+
+      FieldType type;
+      const std::string *data;
+    };
+
+    // Reads and parses lines from a csv file
+    class CsvParser {
+    private:
+      // CSV state for state machine
+      enum class State {
+        START_OF_FIELD,
+        IN_FIELD,
+        IN_QUOTED_FIELD,
+        IN_ESCAPED_QUOTE,
+        END_OF_ROW,
+        EMPTY
+      };
+      State m_state = State::START_OF_FIELD;
+
+      // Configurable attributes
+      char m_none = ' ';
+      char m_quote = '"';
+      char m_delimiter = ',';
+      Term m_terminator = Term::CRLF;
+      std::istream& m_input;
+
+      // Buffer capacities
+      static constexpr int FIELDBUF_CAP = 1024;
+      static constexpr int INPUTBUF_CAP = 1024 * 128;
+
+      // Buffers
+      std::string m_fieldbuf{};
+      std::unique_ptr<char[]> m_inputbuf = std::unique_ptr<char[]>(new char[INPUTBUF_CAP]{});
+
+      // Misc
+      bool m_eof = false;
+      size_t m_cursor = INPUTBUF_CAP;
+      size_t m_inputbuf_size = INPUTBUF_CAP;
+      std::streamoff m_scanposition = -INPUTBUF_CAP;
+    public:
+      // Creates the CSV parser which by default, splits on commas,
+      // uses quotes to escape, and handles CSV files that end in either
+      // '\r', '\n', or '\r\n'.
+      explicit CsvParser(std::istream& input): m_input(input) {
+        // Reserve space upfront to improve performance
+        m_fieldbuf.reserve(FIELDBUF_CAP);
+        if (!m_input.good()) {
+          throw std::runtime_error("Something is wrong with input stream");
+        }
+      }
+
+      // Change the quote character
+      CsvParser&& quote(char c) noexcept {
+        m_quote = c;
+        return std::move(*this);
+      }
+
+      // Change the delimiter character
+      CsvParser&& delimiter(char c) noexcept {
+        m_delimiter = c;
+        return std::move(*this);
+      }
+
+      // Change the terminator character
+      CsvParser&& terminator(char c) noexcept {
+        m_terminator = static_cast<Term>(c);
+        return std::move(*this);
+      }
+
+      // The parser is in the empty state when there are
+      // no more tokens left to read from the input buffer
+      bool empty() {
+        return m_state == State::EMPTY;
+      }
+
+      // Not the actual position in the stream (it's buffered), just the
+      // position up to the last available token
+      std::streamoff position() const
+      {
+          return m_scanposition + static_cast<std::streamoff>(m_cursor);
+      }
+
+      // Reads a single field from the CSV
+      Field next_field() {
+        if (empty()) {
+          return Field(FieldType::CSV_END);
+        }
+        m_fieldbuf.clear();
+
+        // This loop runs until either the parser has
+        // read a full field or until there's no tokens left to read
+        for (;;) {
+          char *maybe_token = top_token();
+
+          // If we're out of tokens to read return whatever's left in the
+          // field and row buffers. If there's nothing left, return null.
+          if (!maybe_token) {
+            m_state = State::EMPTY;
+            return !m_fieldbuf.empty() ? Field(m_fieldbuf) : Field(FieldType::CSV_END);
+          }
+
+          // Parsing the CSV is done using a finite state machine
+          char c = *maybe_token;
+          switch (m_state) {
+            case State::START_OF_FIELD:
+              m_cursor++;
+              if (c == m_terminator) {
+                handle_crlf(c);
+                m_state = State::END_OF_ROW;
+                return Field(m_fieldbuf);
+              }
+
+              if (c == m_quote) {
+                m_state = State::IN_QUOTED_FIELD;
+              } else if (c == m_delimiter) {
+                return Field(m_fieldbuf);
+              } else if(c == m_none) {
+                 ;
+              } else {
+                m_state = State::IN_FIELD;
+                m_fieldbuf += c;
+              }
+
+              break;
+
+            case State::IN_FIELD:
+              m_cursor++;
+              if (c == m_terminator) {
+                handle_crlf(c);
+                m_state = State::END_OF_ROW;
+                return Field(m_fieldbuf);
+              }
+
+              if (c == m_delimiter) {
+                m_state = State::START_OF_FIELD;
+                return Field(m_fieldbuf);
+              } else if(c == m_none) {
+                 ;
+              } else {
+                m_fieldbuf += c;
+              }
+
+              break;
+
+            case State::IN_QUOTED_FIELD:
+              m_cursor++;
+              if (c == m_quote) {
+                m_state = State::IN_ESCAPED_QUOTE;
+              } else {
+                m_fieldbuf += c;
+              }
+
+              break;
+
+            case State::IN_ESCAPED_QUOTE:
+              m_cursor++;
+              if (c == m_terminator) {
+                handle_crlf(c);
+                m_state = State::END_OF_ROW;
+                return Field(m_fieldbuf);
+              }
+
+              if (c == m_quote) {
+                m_state = State::IN_QUOTED_FIELD;
+                m_fieldbuf += c;
+              } else if (c == m_delimiter) {
+                m_state = State::START_OF_FIELD;
+                return Field(m_fieldbuf);
+              } else {
+                m_state = State::IN_FIELD;
+                m_fieldbuf += c;
+              }
+
+              break;
+
+            case State::END_OF_ROW:
+              m_state = State::START_OF_FIELD;
+              return Field(FieldType::ROW_END);
+
+            case State::EMPTY:
+              throw std::logic_error("You goofed");
+          }
+        }
+      }
+    private:
+      // When the parser hits the end of a line it needs
+      // to check the special case of '\r\n' as a terminator.
+      // If it finds that the previous token was a '\r', and
+      // the next token will be a '\n', it skips the '\n'.
+      void handle_crlf(const char c) {
+        if (m_terminator != Term::CRLF || c != '\r') {
+          return;
+        }
+
+        char *token = top_token();
+        if (token && *token == '\n') {
+          m_cursor++;
+        }
+      }
+
+      // Pulls the next token from the input buffer, but does not move
+      // the cursor forward. If the stream is empty and the input buffer
+      // is also empty return a nullptr.
+      char* top_token() {
+        // Return null if there's nothing left to read
+        if (m_eof && m_cursor == m_inputbuf_size) {
+          return nullptr;
+        }
+
+        // Refill the input buffer if it's been fully read
+        if (m_cursor == m_inputbuf_size) {
+          m_scanposition += static_cast<std::streamoff>(m_cursor);
+          m_cursor = 0;
+          m_input.read(m_inputbuf.get(), INPUTBUF_CAP);
+
+          // Indicate we hit end of file, and resize
+          // input buffer to show that it's not at full capacity
+          if (m_input.eof()) {
+            m_eof = true;
+            m_inputbuf_size = m_input.gcount();
+
+            // Return null if there's nothing left to read
+            if (m_inputbuf_size == 0) {
+              return nullptr;
+            }
+          }
+        }
+
+        return &m_inputbuf[m_cursor];
+      }
+    public:
+      // Iterator implementation for the CSV parser, which reads
+      // from the CSV row by row in the form of a vector of strings
+      class iterator {
+      public:
+        using difference_type = std::ptrdiff_t;
+        using value_type = std::vector<std::string>;
+        using pointer = const std::vector<std::string>*;
+        using reference = const std::vector<std::string>&;
+        using iterator_category = std::input_iterator_tag;
+
+        explicit iterator(CsvParser *p, bool end = false): m_parser(p) {
+          if (!end) {
+            m_row.reserve(50);
+            m_current_row = 0;
+            next();
+          }
+        }
+
+        iterator& operator++() {
+          next();
+          return *this;
+        }
+
+        iterator operator++(int) {
+          iterator i = (*this);
+          ++(*this);
+          return i;
+        }
+
+        bool operator==(const iterator& other) const {
+          return m_current_row == other.m_current_row
+            && m_row.size() == other.m_row.size();
+        }
+
+        bool operator!=(const iterator& other) const {
+          return !(*this == other);
+        }
+
+        reference operator*() const {
+          return m_row;
+        }
+
+        pointer operator->() const {
+          return &m_row;
+        }
+      private:
+        value_type m_row{};
+        CsvParser *m_parser;
+        int m_current_row = -1;
+
+        void next() {
+          value_type::size_type num_fields = 0;
+          for (;;) {
+            auto field = m_parser->next_field();
+            switch (field.type) {
+              case FieldType::CSV_END:
+                if (num_fields < m_row.size()) {
+                  m_row.resize(num_fields);
+                }
+                m_current_row = -1;
+                return;
+              case FieldType::ROW_END:
+                if (num_fields < m_row.size()) {
+                  m_row.resize(num_fields);
+                }
+                m_current_row++;
+                return;
+              case FieldType::DATA:
+                if (num_fields < m_row.size()) {
+                  m_row[num_fields] = std::move(*field.data);
+                } else {
+                  m_row.push_back(std::move(*field.data));
+                }
+                num_fields++;
+            }
+          }
+        }
+      };
+
+      iterator begin() { return iterator(this); };
+      iterator end() { return iterator(this, true); };
+    };
+  }
+}
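+
+// Usage sketch (illustrative):
+//   std::ifstream f("table.csv");
+//   aria::csv::CsvParser parser(f);
+//   for (const auto& row : parser) {          // row is a std::vector<std::string>
+//     for (const auto& field : row) { /* ... */ }
+//   }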
+#endif
diff --git a/data-access/engine/ext/nlohmann-json/Makefile b/data-access/engine/ext/nlohmann-json/Makefile
new file mode 100644
index 0000000000000000000000000000000000000000..8ff3217a5c1699081ff0f643610d2c9cf1545527
--- /dev/null
+++ b/data-access/engine/ext/nlohmann-json/Makefile
@@ -0,0 +1,11 @@
+
+
+# empty if nothing needs to be built
+
+all:
+
+
+
+clean:
+
+
diff --git a/data-access/engine/ext/nlohmann-json/include/json.hpp b/data-access/engine/ext/nlohmann-json/include/json.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..5a538880a353d8debfa05eed88eb0db3dbc0f9eb
--- /dev/null
+++ b/data-access/engine/ext/nlohmann-json/include/json.hpp
@@ -0,0 +1,8 @@
+#ifndef JSON_HPP
+#define JSON_HPP
+
+# define JSON_DIAGNOSTICS 1
+#include "nljson/json.hpp"
+
+#endif
+
diff --git a/data-access/engine/ext/nlohmann-json/include/nljson/json.hpp b/data-access/engine/ext/nlohmann-json/include/nljson/json.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..cb27e058119257b63be7e3285ac14814af730b94
--- /dev/null
+++ b/data-access/engine/ext/nlohmann-json/include/nljson/json.hpp
@@ -0,0 +1,22091 @@
+/*
+    __ _____ _____ _____
+ __|  |   __|     |   | |  JSON for Modern C++
+|  |  |__   |  |  | | | |  version 3.10.5
+|_____|_____|_____|_|___|  https://github.com/nlohmann/json
+
+Licensed under the MIT License <http://opensource.org/licenses/MIT>.
+SPDX-License-Identifier: MIT
+Copyright (c) 2013-2022 Niels Lohmann <http://nlohmann.me>.
+
+Permission is hereby  granted, free of charge, to any  person obtaining a copy
+of this software and associated  documentation files (the "Software"), to deal
+in the Software  without restriction, including without  limitation the rights
+to  use, copy,  modify, merge,  publish, distribute,  sublicense, and/or  sell
+copies  of  the Software,  and  to  permit persons  to  whom  the Software  is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE  IS PROVIDED "AS  IS", WITHOUT WARRANTY  OF ANY KIND,  EXPRESS OR
+IMPLIED,  INCLUDING BUT  NOT  LIMITED TO  THE  WARRANTIES OF  MERCHANTABILITY,
+FITNESS FOR  A PARTICULAR PURPOSE AND  NONINFRINGEMENT. IN NO EVENT  SHALL THE
+AUTHORS  OR COPYRIGHT  HOLDERS  BE  LIABLE FOR  ANY  CLAIM,  DAMAGES OR  OTHER
+LIABILITY, WHETHER IN AN ACTION OF  CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE  OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
+*/
+
+/****************************************************************************\
+ * Note on documentation: The source files contain links to the online      *
+ * documentation of the public API at https://json.nlohmann.me. This URL    *
+ * contains the most recent documentation and should also be applicable to  *
+ * previous versions; documentation for deprecated functions is not         *
+ * removed, but marked deprecated. See "Generate documentation" section in  *
+ * file doc/README.md.                                                      *
+\****************************************************************************/
+
+#ifndef INCLUDE_NLOHMANN_JSON_HPP_
+#define INCLUDE_NLOHMANN_JSON_HPP_
+
+#define NLOHMANN_JSON_VERSION_MAJOR 3
+#define NLOHMANN_JSON_VERSION_MINOR 10
+#define NLOHMANN_JSON_VERSION_PATCH 5
+
+#include <algorithm> // all_of, find, for_each
+#include <cstddef> // nullptr_t, ptrdiff_t, size_t
+#include <functional> // hash, less
+#include <initializer_list> // initializer_list
+#ifndef JSON_NO_IO
+    #include <iosfwd> // istream, ostream
+#endif  // JSON_NO_IO
+#include <iterator> // random_access_iterator_tag
+#include <memory> // unique_ptr
+#include <numeric> // accumulate
+#include <string> // string, stoi, to_string
+#include <utility> // declval, forward, move, pair, swap
+#include <vector> // vector
+
+// #include <nlohmann/adl_serializer.hpp>
+
+
+#include <type_traits>
+#include <utility>
+
+// #include <nlohmann/detail/conversions/from_json.hpp>
+
+
+#include <algorithm> // transform
+#include <array> // array
+#include <forward_list> // forward_list
+#include <iterator> // inserter, front_inserter, end
+#include <map> // map
+#include <string> // string
+#include <tuple> // tuple, make_tuple
+#include <type_traits> // is_arithmetic, is_same, is_enum, underlying_type, is_convertible
+#include <unordered_map> // unordered_map
+#include <utility> // pair, declval
+#include <valarray> // valarray
+
+// #include <nlohmann/detail/exceptions.hpp>
+
+
+#include <exception> // exception
+#include <stdexcept> // runtime_error
+#include <string> // to_string
+#include <vector> // vector
+
+// #include <nlohmann/detail/value_t.hpp>
+
+
+#include <array> // array
+#include <cstddef> // size_t
+#include <cstdint> // uint8_t
+#include <string> // string
+
+namespace nlohmann
+{
+namespace detail
+{
+///////////////////////////
+// JSON type enumeration //
+///////////////////////////
+
+/*!
+@brief the JSON type enumeration
+
+This enumeration collects the different JSON types. It is internally used to
+distinguish the stored values, and the functions @ref basic_json::is_null(),
+@ref basic_json::is_object(), @ref basic_json::is_array(),
+@ref basic_json::is_string(), @ref basic_json::is_boolean(),
+@ref basic_json::is_number() (with @ref basic_json::is_number_integer(),
+@ref basic_json::is_number_unsigned(), and @ref basic_json::is_number_float()),
+@ref basic_json::is_discarded(), @ref basic_json::is_primitive(), and
+@ref basic_json::is_structured() rely on it.
+
+@note There are three enumeration entries (number_integer, number_unsigned, and
+number_float), because the library distinguishes these three types for numbers:
+@ref basic_json::number_unsigned_t is used for unsigned integers,
+@ref basic_json::number_integer_t is used for signed integers, and
+@ref basic_json::number_float_t is used for floating-point numbers or to
+approximate integers which do not fit in the limits of their respective type.
+
+@sa see @ref basic_json::basic_json(const value_t value_type) -- create a JSON
+value with the default value for a given type
+
+@since version 1.0.0
+*/
+enum class value_t : std::uint8_t
+{
+    null,             ///< null value
+    object,           ///< object (unordered set of name/value pairs)
+    array,            ///< array (ordered collection of values)
+    string,           ///< string value
+    boolean,          ///< boolean value
+    number_integer,   ///< number value (signed integer)
+    number_unsigned,  ///< number value (unsigned integer)
+    number_float,     ///< number value (floating-point)
+    binary,           ///< binary array (ordered collection of bytes)
+    discarded         ///< discarded by the parser callback function
+};
+
+/*!
+@brief comparison operator for JSON types
+
+Returns an ordering that is similar to Python:
+- order: null < boolean < number < object < array < string < binary
+- furthermore, each type is not smaller than itself
+- discarded values are not comparable
+- binary is represented as a b"" string in python and directly comparable to a
+  string; however, making a binary array directly comparable with a string would
+  be surprising behavior in a JSON file.
+
+@since version 1.0.0
+*/
+inline bool operator<(const value_t lhs, const value_t rhs) noexcept
+{
+    static constexpr std::array<std::uint8_t, 9> order = {{
+            0 /* null */, 3 /* object */, 4 /* array */, 5 /* string */,
+            1 /* boolean */, 2 /* integer */, 2 /* unsigned */, 2 /* float */,
+            6 /* binary */
+        }
+    };
+
+    const auto l_index = static_cast<std::size_t>(lhs);
+    const auto r_index = static_cast<std::size_t>(rhs);
+    return l_index < order.size() && r_index < order.size() && order[l_index] < order[r_index];
+}
+}  // namespace detail
+}  // namespace nlohmann
+
+// #include <nlohmann/detail/string_escape.hpp>
+
+
+#include <string>
+// #include <nlohmann/detail/macro_scope.hpp>
+
+
+#include <utility> // declval, pair
+// #include <nlohmann/thirdparty/hedley/hedley.hpp>
+
+
+/* Hedley - https://nemequ.github.io/hedley
+ * Created by Evan Nemerson <evan@nemerson.com>
+ *
+ * To the extent possible under law, the author(s) have dedicated all
+ * copyright and related and neighboring rights to this software to
+ * the public domain worldwide. This software is distributed without
+ * any warranty.
+ *
+ * For details, see <http://creativecommons.org/publicdomain/zero/1.0/>.
+ * SPDX-License-Identifier: CC0-1.0
+ */
+
+#if !defined(JSON_HEDLEY_VERSION) || (JSON_HEDLEY_VERSION < 15)
+#if defined(JSON_HEDLEY_VERSION)
+    #undef JSON_HEDLEY_VERSION
+#endif
+#define JSON_HEDLEY_VERSION 15
+
+#if defined(JSON_HEDLEY_STRINGIFY_EX)
+    #undef JSON_HEDLEY_STRINGIFY_EX
+#endif
+#define JSON_HEDLEY_STRINGIFY_EX(x) #x
+
+#if defined(JSON_HEDLEY_STRINGIFY)
+    #undef JSON_HEDLEY_STRINGIFY
+#endif
+#define JSON_HEDLEY_STRINGIFY(x) JSON_HEDLEY_STRINGIFY_EX(x)
+
+#if defined(JSON_HEDLEY_CONCAT_EX)
+    #undef JSON_HEDLEY_CONCAT_EX
+#endif
+#define JSON_HEDLEY_CONCAT_EX(a,b) a##b
+
+#if defined(JSON_HEDLEY_CONCAT)
+    #undef JSON_HEDLEY_CONCAT
+#endif
+#define JSON_HEDLEY_CONCAT(a,b) JSON_HEDLEY_CONCAT_EX(a,b)
+
+#if defined(JSON_HEDLEY_CONCAT3_EX)
+    #undef JSON_HEDLEY_CONCAT3_EX
+#endif
+#define JSON_HEDLEY_CONCAT3_EX(a,b,c) a##b##c
+
+#if defined(JSON_HEDLEY_CONCAT3)
+    #undef JSON_HEDLEY_CONCAT3
+#endif
+#define JSON_HEDLEY_CONCAT3(a,b,c) JSON_HEDLEY_CONCAT3_EX(a,b,c)
+
+#if defined(JSON_HEDLEY_VERSION_ENCODE)
+    #undef JSON_HEDLEY_VERSION_ENCODE
+#endif
+#define JSON_HEDLEY_VERSION_ENCODE(major,minor,revision) (((major) * 1000000) + ((minor) * 1000) + (revision))
+
+#if defined(JSON_HEDLEY_VERSION_DECODE_MAJOR)
+    #undef JSON_HEDLEY_VERSION_DECODE_MAJOR
+#endif
+#define JSON_HEDLEY_VERSION_DECODE_MAJOR(version) ((version) / 1000000)
+
+#if defined(JSON_HEDLEY_VERSION_DECODE_MINOR)
+    #undef JSON_HEDLEY_VERSION_DECODE_MINOR
+#endif
+#define JSON_HEDLEY_VERSION_DECODE_MINOR(version) (((version) % 1000000) / 1000)
+
+#if defined(JSON_HEDLEY_VERSION_DECODE_REVISION)
+    #undef JSON_HEDLEY_VERSION_DECODE_REVISION
+#endif
+#define JSON_HEDLEY_VERSION_DECODE_REVISION(version) ((version) % 1000)
+
+#if defined(JSON_HEDLEY_GNUC_VERSION)
+    #undef JSON_HEDLEY_GNUC_VERSION
+#endif
+#if defined(__GNUC__) && defined(__GNUC_PATCHLEVEL__)
+    #define JSON_HEDLEY_GNUC_VERSION JSON_HEDLEY_VERSION_ENCODE(__GNUC__, __GNUC_MINOR__, __GNUC_PATCHLEVEL__)
+#elif defined(__GNUC__)
+    #define JSON_HEDLEY_GNUC_VERSION JSON_HEDLEY_VERSION_ENCODE(__GNUC__, __GNUC_MINOR__, 0)
+#endif
+
+#if defined(JSON_HEDLEY_GNUC_VERSION_CHECK)
+    #undef JSON_HEDLEY_GNUC_VERSION_CHECK
+#endif
+#if defined(JSON_HEDLEY_GNUC_VERSION)
+    #define JSON_HEDLEY_GNUC_VERSION_CHECK(major,minor,patch) (JSON_HEDLEY_GNUC_VERSION >= JSON_HEDLEY_VERSION_ENCODE(major, minor, patch))
+#else
+    #define JSON_HEDLEY_GNUC_VERSION_CHECK(major,minor,patch) (0)
+#endif
+
+#if defined(JSON_HEDLEY_MSVC_VERSION)
+    #undef JSON_HEDLEY_MSVC_VERSION
+#endif
+#if defined(_MSC_FULL_VER) && (_MSC_FULL_VER >= 140000000) && !defined(__ICL)
+    #define JSON_HEDLEY_MSVC_VERSION JSON_HEDLEY_VERSION_ENCODE(_MSC_FULL_VER / 10000000, (_MSC_FULL_VER % 10000000) / 100000, (_MSC_FULL_VER % 100000) / 100)
+#elif defined(_MSC_FULL_VER) && !defined(__ICL)
+    #define JSON_HEDLEY_MSVC_VERSION JSON_HEDLEY_VERSION_ENCODE(_MSC_FULL_VER / 1000000, (_MSC_FULL_VER % 1000000) / 10000, (_MSC_FULL_VER % 10000) / 10)
+#elif defined(_MSC_VER) && !defined(__ICL)
+    #define JSON_HEDLEY_MSVC_VERSION JSON_HEDLEY_VERSION_ENCODE(_MSC_VER / 100, _MSC_VER % 100, 0)
+#endif
+
+#if defined(JSON_HEDLEY_MSVC_VERSION_CHECK)
+    #undef JSON_HEDLEY_MSVC_VERSION_CHECK
+#endif
+#if !defined(JSON_HEDLEY_MSVC_VERSION)
+    #define JSON_HEDLEY_MSVC_VERSION_CHECK(major,minor,patch) (0)
+#elif defined(_MSC_VER) && (_MSC_VER >= 1400)
+    #define JSON_HEDLEY_MSVC_VERSION_CHECK(major,minor,patch) (_MSC_FULL_VER >= ((major * 10000000) + (minor * 100000) + (patch)))
+#elif defined(_MSC_VER) && (_MSC_VER >= 1200)
+    #define JSON_HEDLEY_MSVC_VERSION_CHECK(major,minor,patch) (_MSC_FULL_VER >= ((major * 1000000) + (minor * 10000) + (patch)))
+#else
+    #define JSON_HEDLEY_MSVC_VERSION_CHECK(major,minor,patch) (_MSC_VER >= ((major * 100) + (minor)))
+#endif
+
+#if defined(JSON_HEDLEY_INTEL_VERSION)
+    #undef JSON_HEDLEY_INTEL_VERSION
+#endif
+#if defined(__INTEL_COMPILER) && defined(__INTEL_COMPILER_UPDATE) && !defined(__ICL)
+    #define JSON_HEDLEY_INTEL_VERSION JSON_HEDLEY_VERSION_ENCODE(__INTEL_COMPILER / 100, __INTEL_COMPILER % 100, __INTEL_COMPILER_UPDATE)
+#elif defined(__INTEL_COMPILER) && !defined(__ICL)
+    #define JSON_HEDLEY_INTEL_VERSION JSON_HEDLEY_VERSION_ENCODE(__INTEL_COMPILER / 100, __INTEL_COMPILER % 100, 0)
+#endif
+
+#if defined(JSON_HEDLEY_INTEL_VERSION_CHECK)
+    #undef JSON_HEDLEY_INTEL_VERSION_CHECK
+#endif
+#if defined(JSON_HEDLEY_INTEL_VERSION)
+    #define JSON_HEDLEY_INTEL_VERSION_CHECK(major,minor,patch) (JSON_HEDLEY_INTEL_VERSION >= JSON_HEDLEY_VERSION_ENCODE(major, minor, patch))
+#else
+    #define JSON_HEDLEY_INTEL_VERSION_CHECK(major,minor,patch) (0)
+#endif
+
+#if defined(JSON_HEDLEY_INTEL_CL_VERSION)
+    #undef JSON_HEDLEY_INTEL_CL_VERSION
+#endif
+#if defined(__INTEL_COMPILER) && defined(__INTEL_COMPILER_UPDATE) && defined(__ICL)
+    #define JSON_HEDLEY_INTEL_CL_VERSION JSON_HEDLEY_VERSION_ENCODE(__INTEL_COMPILER, __INTEL_COMPILER_UPDATE, 0)
+#endif
+
+#if defined(JSON_HEDLEY_INTEL_CL_VERSION_CHECK)
+    #undef JSON_HEDLEY_INTEL_CL_VERSION_CHECK
+#endif
+#if defined(JSON_HEDLEY_INTEL_CL_VERSION)
+    #define JSON_HEDLEY_INTEL_CL_VERSION_CHECK(major,minor,patch) (JSON_HEDLEY_INTEL_CL_VERSION >= JSON_HEDLEY_VERSION_ENCODE(major, minor, patch))
+#else
+    #define JSON_HEDLEY_INTEL_CL_VERSION_CHECK(major,minor,patch) (0)
+#endif
+
+#if defined(JSON_HEDLEY_PGI_VERSION)
+    #undef JSON_HEDLEY_PGI_VERSION
+#endif
+#if defined(__PGI) && defined(__PGIC__) && defined(__PGIC_MINOR__) && defined(__PGIC_PATCHLEVEL__)
+    #define JSON_HEDLEY_PGI_VERSION JSON_HEDLEY_VERSION_ENCODE(__PGIC__, __PGIC_MINOR__, __PGIC_PATCHLEVEL__)
+#endif
+
+#if defined(JSON_HEDLEY_PGI_VERSION_CHECK)
+    #undef JSON_HEDLEY_PGI_VERSION_CHECK
+#endif
+#if defined(JSON_HEDLEY_PGI_VERSION)
+    #define JSON_HEDLEY_PGI_VERSION_CHECK(major,minor,patch) (JSON_HEDLEY_PGI_VERSION >= JSON_HEDLEY_VERSION_ENCODE(major, minor, patch))
+#else
+    #define JSON_HEDLEY_PGI_VERSION_CHECK(major,minor,patch) (0)
+#endif
+
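+/* Oracle/Sun: __SUNPRO_C / __SUNPRO_CC are BCD-encoded hex values; the
+   newer scheme (values above 0x1000) packs two BCD digits per component,
+   e.g. 0x5130 decodes to 5.13.0, which the shifts and masks below unpack. */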
+#if defined(JSON_HEDLEY_SUNPRO_VERSION)
+    #undef JSON_HEDLEY_SUNPRO_VERSION
+#endif
+#if defined(__SUNPRO_C) && (__SUNPRO_C > 0x1000)
+    #define JSON_HEDLEY_SUNPRO_VERSION JSON_HEDLEY_VERSION_ENCODE((((__SUNPRO_C >> 16) & 0xf) * 10) + ((__SUNPRO_C >> 12) & 0xf), (((__SUNPRO_C >> 8) & 0xf) * 10) + ((__SUNPRO_C >> 4) & 0xf), (__SUNPRO_C & 0xf) * 10)
+#elif defined(__SUNPRO_C)
+    #define JSON_HEDLEY_SUNPRO_VERSION JSON_HEDLEY_VERSION_ENCODE((__SUNPRO_C >> 8) & 0xf, (__SUNPRO_C >> 4) & 0xf, (__SUNPRO_C) & 0xf)
+#elif defined(__SUNPRO_CC) && (__SUNPRO_CC > 0x1000)
+    #define JSON_HEDLEY_SUNPRO_VERSION JSON_HEDLEY_VERSION_ENCODE((((__SUNPRO_CC >> 16) & 0xf) * 10) + ((__SUNPRO_CC >> 12) & 0xf), (((__SUNPRO_CC >> 8) & 0xf) * 10) + ((__SUNPRO_CC >> 4) & 0xf), (__SUNPRO_CC & 0xf) * 10)
+#elif defined(__SUNPRO_CC)
+    #define JSON_HEDLEY_SUNPRO_VERSION JSON_HEDLEY_VERSION_ENCODE((__SUNPRO_CC >> 8) & 0xf, (__SUNPRO_CC >> 4) & 0xf, (__SUNPRO_CC) & 0xf)
+#endif
+
+#if defined(JSON_HEDLEY_SUNPRO_VERSION_CHECK)
+    #undef JSON_HEDLEY_SUNPRO_VERSION_CHECK
+#endif
+#if defined(JSON_HEDLEY_SUNPRO_VERSION)
+    #define JSON_HEDLEY_SUNPRO_VERSION_CHECK(major,minor,patch) (JSON_HEDLEY_SUNPRO_VERSION >= JSON_HEDLEY_VERSION_ENCODE(major, minor, patch))
+#else
+    #define JSON_HEDLEY_SUNPRO_VERSION_CHECK(major,minor,patch) (0)
+#endif
+
+#if defined(JSON_HEDLEY_EMSCRIPTEN_VERSION)
+    #undef JSON_HEDLEY_EMSCRIPTEN_VERSION
+#endif
+#if defined(__EMSCRIPTEN__)
+    #define JSON_HEDLEY_EMSCRIPTEN_VERSION JSON_HEDLEY_VERSION_ENCODE(__EMSCRIPTEN_major__, __EMSCRIPTEN_minor__, __EMSCRIPTEN_tiny__)
+#endif
+
+#if defined(JSON_HEDLEY_EMSCRIPTEN_VERSION_CHECK)
+    #undef JSON_HEDLEY_EMSCRIPTEN_VERSION_CHECK
+#endif
+#if defined(JSON_HEDLEY_EMSCRIPTEN_VERSION)
+    #define JSON_HEDLEY_EMSCRIPTEN_VERSION_CHECK(major,minor,patch) (JSON_HEDLEY_EMSCRIPTEN_VERSION >= JSON_HEDLEY_VERSION_ENCODE(major, minor, patch))
+#else
+    #define JSON_HEDLEY_EMSCRIPTEN_VERSION_CHECK(major,minor,patch) (0)
+#endif
+
+#if defined(JSON_HEDLEY_ARM_VERSION)
+    #undef JSON_HEDLEY_ARM_VERSION
+#endif
+#if defined(__CC_ARM) && defined(__ARMCOMPILER_VERSION)
+    #define JSON_HEDLEY_ARM_VERSION JSON_HEDLEY_VERSION_ENCODE(__ARMCOMPILER_VERSION / 1000000, (__ARMCOMPILER_VERSION % 1000000) / 10000, (__ARMCOMPILER_VERSION % 10000) / 100)
+#elif defined(__CC_ARM) && defined(__ARMCC_VERSION)
+    #define JSON_HEDLEY_ARM_VERSION JSON_HEDLEY_VERSION_ENCODE(__ARMCC_VERSION / 1000000, (__ARMCC_VERSION % 1000000) / 10000, (__ARMCC_VERSION % 10000) / 100)
+#endif
+
+#if defined(JSON_HEDLEY_ARM_VERSION_CHECK)
+    #undef JSON_HEDLEY_ARM_VERSION_CHECK
+#endif
+#if defined(JSON_HEDLEY_ARM_VERSION)
+    #define JSON_HEDLEY_ARM_VERSION_CHECK(major,minor,patch) (JSON_HEDLEY_ARM_VERSION >= JSON_HEDLEY_VERSION_ENCODE(major, minor, patch))
+#else
+    #define JSON_HEDLEY_ARM_VERSION_CHECK(major,minor,patch) (0)
+#endif
+
+#if defined(JSON_HEDLEY_IBM_VERSION)
+    #undef JSON_HEDLEY_IBM_VERSION
+#endif
+#if defined(__ibmxl__)
+    #define JSON_HEDLEY_IBM_VERSION JSON_HEDLEY_VERSION_ENCODE(__ibmxl_version__, __ibmxl_release__, __ibmxl_modification__)
+#elif defined(__xlC__) && defined(__xlC_ver__)
+    #define JSON_HEDLEY_IBM_VERSION JSON_HEDLEY_VERSION_ENCODE(__xlC__ >> 8, __xlC__ & 0xff, (__xlC_ver__ >> 8) & 0xff)
+#elif defined(__xlC__)
+    #define JSON_HEDLEY_IBM_VERSION JSON_HEDLEY_VERSION_ENCODE(__xlC__ >> 8, __xlC__ & 0xff, 0)
+#endif
+
+#if defined(JSON_HEDLEY_IBM_VERSION_CHECK)
+    #undef JSON_HEDLEY_IBM_VERSION_CHECK
+#endif
+#if defined(JSON_HEDLEY_IBM_VERSION)
+    #define JSON_HEDLEY_IBM_VERSION_CHECK(major,minor,patch) (JSON_HEDLEY_IBM_VERSION >= JSON_HEDLEY_VERSION_ENCODE(major, minor, patch))
+#else
+    #define JSON_HEDLEY_IBM_VERSION_CHECK(major,minor,patch) (0)
+#endif
+
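+/* TI compilers report __TI_COMPILER_VERSION__ as a decimal
+   major*1000000 + minor*1000 + patch value.  The generic TI macro below
+   only covers the newer (>= 16.x) unified releases; the per-target
+   macros that follow (CL2000, CL430, ARMCL, CL6X, CL7X, CLPRU) key off
+   the architecture define each individual TI toolchain sets. */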
+#if defined(JSON_HEDLEY_TI_VERSION)
+    #undef JSON_HEDLEY_TI_VERSION
+#endif
+#if \
+    defined(__TI_COMPILER_VERSION__) && \
+    ( \
+      defined(__TMS470__) || defined(__TI_ARM__) || \
+      defined(__MSP430__) || \
+      defined(__TMS320C2000__) \
+    )
+#if (__TI_COMPILER_VERSION__ >= 16000000)
+    #define JSON_HEDLEY_TI_VERSION JSON_HEDLEY_VERSION_ENCODE(__TI_COMPILER_VERSION__ / 1000000, (__TI_COMPILER_VERSION__ % 1000000) / 1000, (__TI_COMPILER_VERSION__ % 1000))
+#endif
+#endif
+
+#if defined(JSON_HEDLEY_TI_VERSION_CHECK)
+    #undef JSON_HEDLEY_TI_VERSION_CHECK
+#endif
+#if defined(JSON_HEDLEY_TI_VERSION)
+    #define JSON_HEDLEY_TI_VERSION_CHECK(major,minor,patch) (JSON_HEDLEY_TI_VERSION >= JSON_HEDLEY_VERSION_ENCODE(major, minor, patch))
+#else
+    #define JSON_HEDLEY_TI_VERSION_CHECK(major,minor,patch) (0)
+#endif
+
+#if defined(JSON_HEDLEY_TI_CL2000_VERSION)
+    #undef JSON_HEDLEY_TI_CL2000_VERSION
+#endif
+#if defined(__TI_COMPILER_VERSION__) && defined(__TMS320C2000__)
+    #define JSON_HEDLEY_TI_CL2000_VERSION JSON_HEDLEY_VERSION_ENCODE(__TI_COMPILER_VERSION__ / 1000000, (__TI_COMPILER_VERSION__ % 1000000) / 1000, (__TI_COMPILER_VERSION__ % 1000))
+#endif
+
+#if defined(JSON_HEDLEY_TI_CL2000_VERSION_CHECK)
+    #undef JSON_HEDLEY_TI_CL2000_VERSION_CHECK
+#endif
+#if defined(JSON_HEDLEY_TI_CL2000_VERSION)
+    #define JSON_HEDLEY_TI_CL2000_VERSION_CHECK(major,minor,patch) (JSON_HEDLEY_TI_CL2000_VERSION >= JSON_HEDLEY_VERSION_ENCODE(major, minor, patch))
+#else
+    #define JSON_HEDLEY_TI_CL2000_VERSION_CHECK(major,minor,patch) (0)
+#endif
+
+#if defined(JSON_HEDLEY_TI_CL430_VERSION)
+    #undef JSON_HEDLEY_TI_CL430_VERSION
+#endif
+#if defined(__TI_COMPILER_VERSION__) && defined(__MSP430__)
+    #define JSON_HEDLEY_TI_CL430_VERSION JSON_HEDLEY_VERSION_ENCODE(__TI_COMPILER_VERSION__ / 1000000, (__TI_COMPILER_VERSION__ % 1000000) / 1000, (__TI_COMPILER_VERSION__ % 1000))
+#endif
+
+#if defined(JSON_HEDLEY_TI_CL430_VERSION_CHECK)
+    #undef JSON_HEDLEY_TI_CL430_VERSION_CHECK
+#endif
+#if defined(JSON_HEDLEY_TI_CL430_VERSION)
+    #define JSON_HEDLEY_TI_CL430_VERSION_CHECK(major,minor,patch) (JSON_HEDLEY_TI_CL430_VERSION >= JSON_HEDLEY_VERSION_ENCODE(major, minor, patch))
+#else
+    #define JSON_HEDLEY_TI_CL430_VERSION_CHECK(major,minor,patch) (0)
+#endif
+
+#if defined(JSON_HEDLEY_TI_ARMCL_VERSION)
+    #undef JSON_HEDLEY_TI_ARMCL_VERSION
+#endif
+#if defined(__TI_COMPILER_VERSION__) && (defined(__TMS470__) || defined(__TI_ARM__))
+    #define JSON_HEDLEY_TI_ARMCL_VERSION JSON_HEDLEY_VERSION_ENCODE(__TI_COMPILER_VERSION__ / 1000000, (__TI_COMPILER_VERSION__ % 1000000) / 1000, (__TI_COMPILER_VERSION__ % 1000))
+#endif
+
+#if defined(JSON_HEDLEY_TI_ARMCL_VERSION_CHECK)
+    #undef JSON_HEDLEY_TI_ARMCL_VERSION_CHECK
+#endif
+#if defined(JSON_HEDLEY_TI_ARMCL_VERSION)
+    #define JSON_HEDLEY_TI_ARMCL_VERSION_CHECK(major,minor,patch) (JSON_HEDLEY_TI_ARMCL_VERSION >= JSON_HEDLEY_VERSION_ENCODE(major, minor, patch))
+#else
+    #define JSON_HEDLEY_TI_ARMCL_VERSION_CHECK(major,minor,patch) (0)
+#endif
+
+#if defined(JSON_HEDLEY_TI_CL6X_VERSION)
+    #undef JSON_HEDLEY_TI_CL6X_VERSION
+#endif
+#if defined(__TI_COMPILER_VERSION__) && defined(__TMS320C6X__)
+    #define JSON_HEDLEY_TI_CL6X_VERSION JSON_HEDLEY_VERSION_ENCODE(__TI_COMPILER_VERSION__ / 1000000, (__TI_COMPILER_VERSION__ % 1000000) / 1000, (__TI_COMPILER_VERSION__ % 1000))
+#endif
+
+#if defined(JSON_HEDLEY_TI_CL6X_VERSION_CHECK)
+    #undef JSON_HEDLEY_TI_CL6X_VERSION_CHECK
+#endif
+#if defined(JSON_HEDLEY_TI_CL6X_VERSION)
+    #define JSON_HEDLEY_TI_CL6X_VERSION_CHECK(major,minor,patch) (JSON_HEDLEY_TI_CL6X_VERSION >= JSON_HEDLEY_VERSION_ENCODE(major, minor, patch))
+#else
+    #define JSON_HEDLEY_TI_CL6X_VERSION_CHECK(major,minor,patch) (0)
+#endif
+
+#if defined(JSON_HEDLEY_TI_CL7X_VERSION)
+    #undef JSON_HEDLEY_TI_CL7X_VERSION
+#endif
+#if defined(__TI_COMPILER_VERSION__) && defined(__C7000__)
+    #define JSON_HEDLEY_TI_CL7X_VERSION JSON_HEDLEY_VERSION_ENCODE(__TI_COMPILER_VERSION__ / 1000000, (__TI_COMPILER_VERSION__ % 1000000) / 1000, (__TI_COMPILER_VERSION__ % 1000))
+#endif
+
+#if defined(JSON_HEDLEY_TI_CL7X_VERSION_CHECK)
+    #undef JSON_HEDLEY_TI_CL7X_VERSION_CHECK
+#endif
+#if defined(JSON_HEDLEY_TI_CL7X_VERSION)
+    #define JSON_HEDLEY_TI_CL7X_VERSION_CHECK(major,minor,patch) (JSON_HEDLEY_TI_CL7X_VERSION >= JSON_HEDLEY_VERSION_ENCODE(major, minor, patch))
+#else
+    #define JSON_HEDLEY_TI_CL7X_VERSION_CHECK(major,minor,patch) (0)
+#endif
+
+#if defined(JSON_HEDLEY_TI_CLPRU_VERSION)
+    #undef JSON_HEDLEY_TI_CLPRU_VERSION
+#endif
+#if defined(__TI_COMPILER_VERSION__) && defined(__PRU__)
+    #define JSON_HEDLEY_TI_CLPRU_VERSION JSON_HEDLEY_VERSION_ENCODE(__TI_COMPILER_VERSION__ / 1000000, (__TI_COMPILER_VERSION__ % 1000000) / 1000, (__TI_COMPILER_VERSION__ % 1000))
+#endif
+
+#if defined(JSON_HEDLEY_TI_CLPRU_VERSION_CHECK)
+    #undef JSON_HEDLEY_TI_CLPRU_VERSION_CHECK
+#endif
+#if defined(JSON_HEDLEY_TI_CLPRU_VERSION)
+    #define JSON_HEDLEY_TI_CLPRU_VERSION_CHECK(major,minor,patch) (JSON_HEDLEY_TI_CLPRU_VERSION >= JSON_HEDLEY_VERSION_ENCODE(major, minor, patch))
+#else
+    #define JSON_HEDLEY_TI_CLPRU_VERSION_CHECK(major,minor,patch) (0)
+#endif
+
+#if defined(JSON_HEDLEY_CRAY_VERSION)
+    #undef JSON_HEDLEY_CRAY_VERSION
+#endif
+#if defined(_CRAYC)
+    #if defined(_RELEASE_PATCHLEVEL)
+        #define JSON_HEDLEY_CRAY_VERSION JSON_HEDLEY_VERSION_ENCODE(_RELEASE_MAJOR, _RELEASE_MINOR, _RELEASE_PATCHLEVEL)
+    #else
+        #define JSON_HEDLEY_CRAY_VERSION JSON_HEDLEY_VERSION_ENCODE(_RELEASE_MAJOR, _RELEASE_MINOR, 0)
+    #endif
+#endif
+
+#if defined(JSON_HEDLEY_CRAY_VERSION_CHECK)
+    #undef JSON_HEDLEY_CRAY_VERSION_CHECK
+#endif
+#if defined(JSON_HEDLEY_CRAY_VERSION)
+    #define JSON_HEDLEY_CRAY_VERSION_CHECK(major,minor,patch) (JSON_HEDLEY_CRAY_VERSION >= JSON_HEDLEY_VERSION_ENCODE(major, minor, patch))
+#else
+    #define JSON_HEDLEY_CRAY_VERSION_CHECK(major,minor,patch) (0)
+#endif
+
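+/* IAR: newer compilers encode __VER__ as major*1000000 + minor*1000 +
+   patch, older ones as major*100 + minor, hence the split on
+   __VER__ > 1000 below. */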
+#if defined(JSON_HEDLEY_IAR_VERSION)
+    #undef JSON_HEDLEY_IAR_VERSION
+#endif
+#if defined(__IAR_SYSTEMS_ICC__)
+    #if __VER__ > 1000
+        #define JSON_HEDLEY_IAR_VERSION JSON_HEDLEY_VERSION_ENCODE((__VER__ / 1000000), ((__VER__ / 1000) % 1000), (__VER__ % 1000))
+    #else
+        #define JSON_HEDLEY_IAR_VERSION JSON_HEDLEY_VERSION_ENCODE(__VER__ / 100, __VER__ % 100, 0)
+    #endif
+#endif
+
+#if defined(JSON_HEDLEY_IAR_VERSION_CHECK)
+    #undef JSON_HEDLEY_IAR_VERSION_CHECK
+#endif
+#if defined(JSON_HEDLEY_IAR_VERSION)
+    #define JSON_HEDLEY_IAR_VERSION_CHECK(major,minor,patch) (JSON_HEDLEY_IAR_VERSION >= JSON_HEDLEY_VERSION_ENCODE(major, minor, patch))
+#else
+    #define JSON_HEDLEY_IAR_VERSION_CHECK(major,minor,patch) (0)
+#endif
+
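+/* TCC: __TINYC__ is a plain decimal version, e.g. 927 for 0.9.27. */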
+#if defined(JSON_HEDLEY_TINYC_VERSION)
+    #undef JSON_HEDLEY_TINYC_VERSION
+#endif
+#if defined(__TINYC__)
+    #define JSON_HEDLEY_TINYC_VERSION JSON_HEDLEY_VERSION_ENCODE(__TINYC__ / 1000, (__TINYC__ / 100) % 10, __TINYC__ % 100)
+#endif
+
+#if defined(JSON_HEDLEY_TINYC_VERSION_CHECK)
+    #undef JSON_HEDLEY_TINYC_VERSION_CHECK
+#endif
+#if defined(JSON_HEDLEY_TINYC_VERSION)
+    #define JSON_HEDLEY_TINYC_VERSION_CHECK(major,minor,patch) (JSON_HEDLEY_TINYC_VERSION >= JSON_HEDLEY_VERSION_ENCODE(major, minor, patch))
+#else
+    #define JSON_HEDLEY_TINYC_VERSION_CHECK(major,minor,patch) (0)
+#endif
+
+#if defined(JSON_HEDLEY_DMC_VERSION)
+    #undef JSON_HEDLEY_DMC_VERSION
+#endif
+#if defined(__DMC__)
+    #define JSON_HEDLEY_DMC_VERSION JSON_HEDLEY_VERSION_ENCODE(__DMC__ >> 8, (__DMC__ >> 4) & 0xf, __DMC__ & 0xf)
+#endif
+
+#if defined(JSON_HEDLEY_DMC_VERSION_CHECK)
+    #undef JSON_HEDLEY_DMC_VERSION_CHECK
+#endif
+#if defined(JSON_HEDLEY_DMC_VERSION)
+    #define JSON_HEDLEY_DMC_VERSION_CHECK(major,minor,patch) (JSON_HEDLEY_DMC_VERSION >= JSON_HEDLEY_VERSION_ENCODE(major, minor, patch))
+#else
+    #define JSON_HEDLEY_DMC_VERSION_CHECK(major,minor,patch) (0)
+#endif
+
+#if defined(JSON_HEDLEY_COMPCERT_VERSION)
+    #undef JSON_HEDLEY_COMPCERT_VERSION
+#endif
+#if defined(__COMPCERT_VERSION__)
+    #define JSON_HEDLEY_COMPCERT_VERSION JSON_HEDLEY_VERSION_ENCODE(__COMPCERT_VERSION__ / 10000, (__COMPCERT_VERSION__ / 100) % 100, __COMPCERT_VERSION__ % 100)
+#endif
+
+#if defined(JSON_HEDLEY_COMPCERT_VERSION_CHECK)
+    #undef JSON_HEDLEY_COMPCERT_VERSION_CHECK
+#endif
+#if defined(JSON_HEDLEY_COMPCERT_VERSION)
+    #define JSON_HEDLEY_COMPCERT_VERSION_CHECK(major,minor,patch) (JSON_HEDLEY_COMPCERT_VERSION >= JSON_HEDLEY_VERSION_ENCODE(major, minor, patch))
+#else
+    #define JSON_HEDLEY_COMPCERT_VERSION_CHECK(major,minor,patch) (0)
+#endif
+
+#if defined(JSON_HEDLEY_PELLES_VERSION)
+    #undef JSON_HEDLEY_PELLES_VERSION
+#endif
+#if defined(__POCC__)
+    #define JSON_HEDLEY_PELLES_VERSION JSON_HEDLEY_VERSION_ENCODE(__POCC__ / 100, __POCC__ % 100, 0)
+#endif
+
+#if defined(JSON_HEDLEY_PELLES_VERSION_CHECK)
+    #undef JSON_HEDLEY_PELLES_VERSION_CHECK
+#endif
+#if defined(JSON_HEDLEY_PELLES_VERSION)
+    #define JSON_HEDLEY_PELLES_VERSION_CHECK(major,minor,patch) (JSON_HEDLEY_PELLES_VERSION >= JSON_HEDLEY_VERSION_ENCODE(major, minor, patch))
+#else
+    #define JSON_HEDLEY_PELLES_VERSION_CHECK(major,minor,patch) (0)
+#endif
+
+#if defined(JSON_HEDLEY_MCST_LCC_VERSION)
+    #undef JSON_HEDLEY_MCST_LCC_VERSION
+#endif
+#if defined(__LCC__) && defined(__LCC_MINOR__)
+    #define JSON_HEDLEY_MCST_LCC_VERSION JSON_HEDLEY_VERSION_ENCODE(__LCC__ / 100, __LCC__ % 100, __LCC_MINOR__)
+#endif
+
+#if defined(JSON_HEDLEY_MCST_LCC_VERSION_CHECK)
+    #undef JSON_HEDLEY_MCST_LCC_VERSION_CHECK
+#endif
+#if defined(JSON_HEDLEY_MCST_LCC_VERSION)
+    #define JSON_HEDLEY_MCST_LCC_VERSION_CHECK(major,minor,patch) (JSON_HEDLEY_MCST_LCC_VERSION >= JSON_HEDLEY_VERSION_ENCODE(major, minor, patch))
+#else
+    #define JSON_HEDLEY_MCST_LCC_VERSION_CHECK(major,minor,patch) (0)
+#endif
+
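+/* Many compilers define __GNUC__ to advertise GNU C compatibility, so
+   JSON_HEDLEY_GCC_VERSION is only defined when none of the compilers
+   detected above (clang, Intel, PGI, ARM, Cray, TI, CompCert, MCST LCC)
+   is masquerading as GCC.  JSON_HEDLEY_GCC_VERSION_CHECK therefore
+   means "genuinely GCC, at least version major.minor.patch". */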
+#if defined(JSON_HEDLEY_GCC_VERSION)
+    #undef JSON_HEDLEY_GCC_VERSION
+#endif
+#if \
+    defined(JSON_HEDLEY_GNUC_VERSION) && \
+    !defined(__clang__) && \
+    !defined(JSON_HEDLEY_INTEL_VERSION) && \
+    !defined(JSON_HEDLEY_PGI_VERSION) && \
+    !defined(JSON_HEDLEY_ARM_VERSION) && \
+    !defined(JSON_HEDLEY_CRAY_VERSION) && \
+    !defined(JSON_HEDLEY_TI_VERSION) && \
+    !defined(JSON_HEDLEY_TI_ARMCL_VERSION) && \
+    !defined(JSON_HEDLEY_TI_CL430_VERSION) && \
+    !defined(JSON_HEDLEY_TI_CL2000_VERSION) && \
+    !defined(JSON_HEDLEY_TI_CL6X_VERSION) && \
+    !defined(JSON_HEDLEY_TI_CL7X_VERSION) && \
+    !defined(JSON_HEDLEY_TI_CLPRU_VERSION) && \
+    !defined(__COMPCERT__) && \
+    !defined(JSON_HEDLEY_MCST_LCC_VERSION)
+    #define JSON_HEDLEY_GCC_VERSION JSON_HEDLEY_GNUC_VERSION
+#endif
+
+#if defined(JSON_HEDLEY_GCC_VERSION_CHECK)
+    #undef JSON_HEDLEY_GCC_VERSION_CHECK
+#endif
+#if defined(JSON_HEDLEY_GCC_VERSION)
+    #define JSON_HEDLEY_GCC_VERSION_CHECK(major,minor,patch) (JSON_HEDLEY_GCC_VERSION >= JSON_HEDLEY_VERSION_ENCODE(major, minor, patch))
+#else
+    #define JSON_HEDLEY_GCC_VERSION_CHECK(major,minor,patch) (0)
+#endif
+
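+/* Feature-test wrappers: each JSON_HEDLEY_HAS_* macro forwards to the
+   matching __has_* builtin when the preprocessor provides one and
+   otherwise evaluates to 0, so all of them are safe inside #if.  The
+   GNUC_/GCC_-prefixed variants additionally take a GCC version triple
+   to fall back on when __has_* is unavailable. */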
+#if defined(JSON_HEDLEY_HAS_ATTRIBUTE)
+    #undef JSON_HEDLEY_HAS_ATTRIBUTE
+#endif
+#if \
+  defined(__has_attribute) && \
+  ( \
+    (!defined(JSON_HEDLEY_IAR_VERSION) || JSON_HEDLEY_IAR_VERSION_CHECK(8,5,9)) \
+  )
+#  define JSON_HEDLEY_HAS_ATTRIBUTE(attribute) __has_attribute(attribute)
+#else
+#  define JSON_HEDLEY_HAS_ATTRIBUTE(attribute) (0)
+#endif
+
+#if defined(JSON_HEDLEY_GNUC_HAS_ATTRIBUTE)
+    #undef JSON_HEDLEY_GNUC_HAS_ATTRIBUTE
+#endif
+#if defined(__has_attribute)
+    #define JSON_HEDLEY_GNUC_HAS_ATTRIBUTE(attribute,major,minor,patch) JSON_HEDLEY_HAS_ATTRIBUTE(attribute)
+#else
+    #define JSON_HEDLEY_GNUC_HAS_ATTRIBUTE(attribute,major,minor,patch) JSON_HEDLEY_GNUC_VERSION_CHECK(major,minor,patch)
+#endif
+
+#if defined(JSON_HEDLEY_GCC_HAS_ATTRIBUTE)
+    #undef JSON_HEDLEY_GCC_HAS_ATTRIBUTE
+#endif
+#if defined(__has_attribute)
+    #define JSON_HEDLEY_GCC_HAS_ATTRIBUTE(attribute,major,minor,patch) JSON_HEDLEY_HAS_ATTRIBUTE(attribute)
+#else
+    #define JSON_HEDLEY_GCC_HAS_ATTRIBUTE(attribute,major,minor,patch) JSON_HEDLEY_GCC_VERSION_CHECK(major,minor,patch)
+#endif
+
+#if defined(JSON_HEDLEY_HAS_CPP_ATTRIBUTE)
+    #undef JSON_HEDLEY_HAS_CPP_ATTRIBUTE
+#endif
+#if \
+    defined(__has_cpp_attribute) && \
+    defined(__cplusplus) && \
+    (!defined(JSON_HEDLEY_SUNPRO_VERSION) || JSON_HEDLEY_SUNPRO_VERSION_CHECK(5,15,0))
+    #define JSON_HEDLEY_HAS_CPP_ATTRIBUTE(attribute) __has_cpp_attribute(attribute)
+#else
+    #define JSON_HEDLEY_HAS_CPP_ATTRIBUTE(attribute) (0)
+#endif
+
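+/* Probing namespaced C++ attributes (e.g. ns::attribute) with
+   __has_cpp_attribute misbehaves on some compilers (PGI, IAR, SunPro
+   before 5.15, MSVC before 19.20), so those report 0 here. */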
+#if defined(JSON_HEDLEY_HAS_CPP_ATTRIBUTE_NS)
+    #undef JSON_HEDLEY_HAS_CPP_ATTRIBUTE_NS
+#endif
+#if !defined(__cplusplus) || !defined(__has_cpp_attribute)
+    #define JSON_HEDLEY_HAS_CPP_ATTRIBUTE_NS(ns,attribute) (0)
+#elif \
+    !defined(JSON_HEDLEY_PGI_VERSION) && \
+    !defined(JSON_HEDLEY_IAR_VERSION) && \
+    (!defined(JSON_HEDLEY_SUNPRO_VERSION) || JSON_HEDLEY_SUNPRO_VERSION_CHECK(5,15,0)) && \
+    (!defined(JSON_HEDLEY_MSVC_VERSION) || JSON_HEDLEY_MSVC_VERSION_CHECK(19,20,0))
+    #define JSON_HEDLEY_HAS_CPP_ATTRIBUTE_NS(ns,attribute) JSON_HEDLEY_HAS_CPP_ATTRIBUTE(ns::attribute)
+#else
+    #define JSON_HEDLEY_HAS_CPP_ATTRIBUTE_NS(ns,attribute) (0)
+#endif
+
+#if defined(JSON_HEDLEY_GNUC_HAS_CPP_ATTRIBUTE)
+    #undef JSON_HEDLEY_GNUC_HAS_CPP_ATTRIBUTE
+#endif
+#if defined(__has_cpp_attribute) && defined(__cplusplus)
+    #define JSON_HEDLEY_GNUC_HAS_CPP_ATTRIBUTE(attribute,major,minor,patch) __has_cpp_attribute(attribute)
+#else
+    #define JSON_HEDLEY_GNUC_HAS_CPP_ATTRIBUTE(attribute,major,minor,patch) JSON_HEDLEY_GNUC_VERSION_CHECK(major,minor,patch)
+#endif
+
+#if defined(JSON_HEDLEY_GCC_HAS_CPP_ATTRIBUTE)
+    #undef JSON_HEDLEY_GCC_HAS_CPP_ATTRIBUTE
+#endif
+#if defined(__has_cpp_attribute) && defined(__cplusplus)
+    #define JSON_HEDLEY_GCC_HAS_CPP_ATTRIBUTE(attribute,major,minor,patch) __has_cpp_attribute(attribute)
+#else
+    #define JSON_HEDLEY_GCC_HAS_CPP_ATTRIBUTE(attribute,major,minor,patch) JSON_HEDLEY_GCC_VERSION_CHECK(major,minor,patch)
+#endif
+
+#if defined(JSON_HEDLEY_HAS_BUILTIN)
+    #undef JSON_HEDLEY_HAS_BUILTIN
+#endif
+#if defined(__has_builtin)
+    #define JSON_HEDLEY_HAS_BUILTIN(builtin) __has_builtin(builtin)
+#else
+    #define JSON_HEDLEY_HAS_BUILTIN(builtin) (0)
+#endif
+
+#if defined(JSON_HEDLEY_GNUC_HAS_BUILTIN)
+    #undef JSON_HEDLEY_GNUC_HAS_BUILTIN
+#endif
+#if defined(__has_builtin)
+    #define JSON_HEDLEY_GNUC_HAS_BUILTIN(builtin,major,minor,patch) __has_builtin(builtin)
+#else
+    #define JSON_HEDLEY_GNUC_HAS_BUILTIN(builtin,major,minor,patch) JSON_HEDLEY_GNUC_VERSION_CHECK(major,minor,patch)
+#endif
+
+#if defined(JSON_HEDLEY_GCC_HAS_BUILTIN)
+    #undef JSON_HEDLEY_GCC_HAS_BUILTIN
+#endif
+#if defined(__has_builtin)
+    #define JSON_HEDLEY_GCC_HAS_BUILTIN(builtin,major,minor,patch) __has_builtin(builtin)
+#else
+    #define JSON_HEDLEY_GCC_HAS_BUILTIN(builtin,major,minor,patch) JSON_HEDLEY_GCC_VERSION_CHECK(major,minor,patch)
+#endif
+
+#if defined(JSON_HEDLEY_HAS_FEATURE)
+    #undef JSON_HEDLEY_HAS_FEATURE
+#endif
+#if defined(__has_feature)
+    #define JSON_HEDLEY_HAS_FEATURE(feature) __has_feature(feature)
+#else
+    #define JSON_HEDLEY_HAS_FEATURE(feature) (0)
+#endif
+
+#if defined(JSON_HEDLEY_GNUC_HAS_FEATURE)
+    #undef JSON_HEDLEY_GNUC_HAS_FEATURE
+#endif
+#if defined(__has_feature)
+    #define JSON_HEDLEY_GNUC_HAS_FEATURE(feature,major,minor,patch) __has_feature(feature)
+#else
+    #define JSON_HEDLEY_GNUC_HAS_FEATURE(feature,major,minor,patch) JSON_HEDLEY_GNUC_VERSION_CHECK(major,minor,patch)
+#endif
+
+#if defined(JSON_HEDLEY_GCC_HAS_FEATURE)
+    #undef JSON_HEDLEY_GCC_HAS_FEATURE
+#endif
+#if defined(__has_feature)
+    #define JSON_HEDLEY_GCC_HAS_FEATURE(feature,major,minor,patch) __has_feature(feature)
+#else
+    #define JSON_HEDLEY_GCC_HAS_FEATURE(feature,major,minor,patch) JSON_HEDLEY_GCC_VERSION_CHECK(major,minor,patch)
+#endif
+
+#if defined(JSON_HEDLEY_HAS_EXTENSION)
+    #undef JSON_HEDLEY_HAS_EXTENSION
+#endif
+#if defined(__has_extension)
+    #define JSON_HEDLEY_HAS_EXTENSION(extension) __has_extension(extension)
+#else
+    #define JSON_HEDLEY_HAS_EXTENSION(extension) (0)
+#endif
+
+#if defined(JSON_HEDLEY_GNUC_HAS_EXTENSION)
+    #undef JSON_HEDLEY_GNUC_HAS_EXTENSION
+#endif
+#if defined(__has_extension)
+    #define JSON_HEDLEY_GNUC_HAS_EXTENSION(extension,major,minor,patch) __has_extension(extension)
+#else
+    #define JSON_HEDLEY_GNUC_HAS_EXTENSION(extension,major,minor,patch) JSON_HEDLEY_GNUC_VERSION_CHECK(major,minor,patch)
+#endif
+
+#if defined(JSON_HEDLEY_GCC_HAS_EXTENSION)
+    #undef JSON_HEDLEY_GCC_HAS_EXTENSION
+#endif
+#if defined(__has_extension)
+    #define JSON_HEDLEY_GCC_HAS_EXTENSION(extension,major,minor,patch) __has_extension(extension)
+#else
+    #define JSON_HEDLEY_GCC_HAS_EXTENSION(extension,major,minor,patch) JSON_HEDLEY_GCC_VERSION_CHECK(major,minor,patch)
+#endif
+
+#if defined(JSON_HEDLEY_HAS_DECLSPEC_ATTRIBUTE)
+    #undef JSON_HEDLEY_HAS_DECLSPEC_ATTRIBUTE
+#endif
+#if defined(__has_declspec_attribute)
+    #define JSON_HEDLEY_HAS_DECLSPEC_ATTRIBUTE(attribute) __has_declspec_attribute(attribute)
+#else
+    #define JSON_HEDLEY_HAS_DECLSPEC_ATTRIBUTE(attribute) (0)
+#endif
+
+#if defined(JSON_HEDLEY_GNUC_HAS_DECLSPEC_ATTRIBUTE)
+    #undef JSON_HEDLEY_GNUC_HAS_DECLSPEC_ATTRIBUTE
+#endif
+#if defined(__has_declspec_attribute)
+    #define JSON_HEDLEY_GNUC_HAS_DECLSPEC_ATTRIBUTE(attribute,major,minor,patch) __has_declspec_attribute(attribute)
+#else
+    #define JSON_HEDLEY_GNUC_HAS_DECLSPEC_ATTRIBUTE(attribute,major,minor,patch) JSON_HEDLEY_GNUC_VERSION_CHECK(major,minor,patch)
+#endif
+
+#if defined(JSON_HEDLEY_GCC_HAS_DECLSPEC_ATTRIBUTE)
+    #undef JSON_HEDLEY_GCC_HAS_DECLSPEC_ATTRIBUTE
+#endif
+#if defined(__has_declspec_attribute)
+    #define JSON_HEDLEY_GCC_HAS_DECLSPEC_ATTRIBUTE(attribute,major,minor,patch) __has_declspec_attribute(attribute)
+#else
+    #define JSON_HEDLEY_GCC_HAS_DECLSPEC_ATTRIBUTE(attribute,major,minor,patch) JSON_HEDLEY_GCC_VERSION_CHECK(major,minor,patch)
+#endif
+
+#if defined(JSON_HEDLEY_HAS_WARNING)
+    #undef JSON_HEDLEY_HAS_WARNING
+#endif
+#if defined(__has_warning)
+    #define JSON_HEDLEY_HAS_WARNING(warning) __has_warning(warning)
+#else
+    #define JSON_HEDLEY_HAS_WARNING(warning) (0)
+#endif
+
+#if defined(JSON_HEDLEY_GNUC_HAS_WARNING)
+    #undef JSON_HEDLEY_GNUC_HAS_WARNING
+#endif
+#if defined(__has_warning)
+    #define JSON_HEDLEY_GNUC_HAS_WARNING(warning,major,minor,patch) __has_warning(warning)
+#else
+    #define JSON_HEDLEY_GNUC_HAS_WARNING(warning,major,minor,patch) JSON_HEDLEY_GNUC_VERSION_CHECK(major,minor,patch)
+#endif
+
+#if defined(JSON_HEDLEY_GCC_HAS_WARNING)
+    #undef JSON_HEDLEY_GCC_HAS_WARNING
+#endif
+#if defined(__has_warning)
+    #define JSON_HEDLEY_GCC_HAS_WARNING(warning,major,minor,patch) __has_warning(warning)
+#else
+    #define JSON_HEDLEY_GCC_HAS_WARNING(warning,major,minor,patch) JSON_HEDLEY_GCC_VERSION_CHECK(major,minor,patch)
+#endif
+
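+/* JSON_HEDLEY_PRAGMA emits a pragma from inside a macro expansion:
+   C99's _Pragma operator where supported, MSVC's __pragma keyword
+   otherwise, and nothing at all on compilers offering neither. */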
+#if \
+    (defined(__STDC_VERSION__) && (__STDC_VERSION__ >= 199901L)) || \
+    defined(__clang__) || \
+    JSON_HEDLEY_GCC_VERSION_CHECK(3,0,0) || \
+    JSON_HEDLEY_INTEL_VERSION_CHECK(13,0,0) || \
+    JSON_HEDLEY_IAR_VERSION_CHECK(8,0,0) || \
+    JSON_HEDLEY_PGI_VERSION_CHECK(18,4,0) || \
+    JSON_HEDLEY_ARM_VERSION_CHECK(4,1,0) || \
+    JSON_HEDLEY_TI_VERSION_CHECK(15,12,0) || \
+    JSON_HEDLEY_TI_ARMCL_VERSION_CHECK(4,7,0) || \
+    JSON_HEDLEY_TI_CL430_VERSION_CHECK(2,0,1) || \
+    JSON_HEDLEY_TI_CL2000_VERSION_CHECK(6,1,0) || \
+    JSON_HEDLEY_TI_CL6X_VERSION_CHECK(7,0,0) || \
+    JSON_HEDLEY_TI_CL7X_VERSION_CHECK(1,2,0) || \
+    JSON_HEDLEY_TI_CLPRU_VERSION_CHECK(2,1,0) || \
+    JSON_HEDLEY_CRAY_VERSION_CHECK(5,0,0) || \
+    JSON_HEDLEY_TINYC_VERSION_CHECK(0,9,17) || \
+    JSON_HEDLEY_SUNPRO_VERSION_CHECK(8,0,0) || \
+    (JSON_HEDLEY_IBM_VERSION_CHECK(10,1,0) && defined(__C99_PRAGMA_OPERATOR))
+    #define JSON_HEDLEY_PRAGMA(value) _Pragma(#value)
+#elif JSON_HEDLEY_MSVC_VERSION_CHECK(15,0,0)
+    #define JSON_HEDLEY_PRAGMA(value) __pragma(value)
+#else
+    #define JSON_HEDLEY_PRAGMA(value)
+#endif
+
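+/* DIAGNOSTIC_PUSH/POP save and restore the warning state so that the
+   DISABLE_* macros below can silence a diagnostic over a limited
+   region without leaking the suppression into user code. */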
+#if defined(JSON_HEDLEY_DIAGNOSTIC_PUSH)
+    #undef JSON_HEDLEY_DIAGNOSTIC_PUSH
+#endif
+#if defined(JSON_HEDLEY_DIAGNOSTIC_POP)
+    #undef JSON_HEDLEY_DIAGNOSTIC_POP
+#endif
+#if defined(__clang__)
+    #define JSON_HEDLEY_DIAGNOSTIC_PUSH _Pragma("clang diagnostic push")
+    #define JSON_HEDLEY_DIAGNOSTIC_POP _Pragma("clang diagnostic pop")
+#elif JSON_HEDLEY_INTEL_VERSION_CHECK(13,0,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_PUSH _Pragma("warning(push)")
+    #define JSON_HEDLEY_DIAGNOSTIC_POP _Pragma("warning(pop)")
+#elif JSON_HEDLEY_GCC_VERSION_CHECK(4,6,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_PUSH _Pragma("GCC diagnostic push")
+    #define JSON_HEDLEY_DIAGNOSTIC_POP _Pragma("GCC diagnostic pop")
+#elif \
+    JSON_HEDLEY_MSVC_VERSION_CHECK(15,0,0) || \
+    JSON_HEDLEY_INTEL_CL_VERSION_CHECK(2021,1,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_PUSH __pragma(warning(push))
+    #define JSON_HEDLEY_DIAGNOSTIC_POP __pragma(warning(pop))
+#elif JSON_HEDLEY_ARM_VERSION_CHECK(5,6,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_PUSH _Pragma("push")
+    #define JSON_HEDLEY_DIAGNOSTIC_POP _Pragma("pop")
+#elif \
+    JSON_HEDLEY_TI_VERSION_CHECK(15,12,0) || \
+    JSON_HEDLEY_TI_ARMCL_VERSION_CHECK(5,2,0) || \
+    JSON_HEDLEY_TI_CL430_VERSION_CHECK(4,4,0) || \
+    JSON_HEDLEY_TI_CL6X_VERSION_CHECK(8,1,0) || \
+    JSON_HEDLEY_TI_CL7X_VERSION_CHECK(1,2,0) || \
+    JSON_HEDLEY_TI_CLPRU_VERSION_CHECK(2,1,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_PUSH _Pragma("diag_push")
+    #define JSON_HEDLEY_DIAGNOSTIC_POP _Pragma("diag_pop")
+#elif JSON_HEDLEY_PELLES_VERSION_CHECK(2,90,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_PUSH _Pragma("warning(push)")
+    #define JSON_HEDLEY_DIAGNOSTIC_POP _Pragma("warning(pop)")
+#else
+    #define JSON_HEDLEY_DIAGNOSTIC_PUSH
+    #define JSON_HEDLEY_DIAGNOSTIC_POP
+#endif
+
+/* JSON_HEDLEY_DIAGNOSTIC_DISABLE_CPP98_COMPAT_WRAP_ is for
+   HEDLEY INTERNAL USE ONLY.  API subject to change without notice. */
+#if defined(JSON_HEDLEY_DIAGNOSTIC_DISABLE_CPP98_COMPAT_WRAP_)
+    #undef JSON_HEDLEY_DIAGNOSTIC_DISABLE_CPP98_COMPAT_WRAP_
+#endif
+#if defined(__cplusplus)
+#  if JSON_HEDLEY_HAS_WARNING("-Wc++98-compat")
+#    if JSON_HEDLEY_HAS_WARNING("-Wc++17-extensions")
+#      if JSON_HEDLEY_HAS_WARNING("-Wc++1z-extensions")
+#        define JSON_HEDLEY_DIAGNOSTIC_DISABLE_CPP98_COMPAT_WRAP_(xpr) \
+    JSON_HEDLEY_DIAGNOSTIC_PUSH \
+    _Pragma("clang diagnostic ignored \"-Wc++98-compat\"") \
+    _Pragma("clang diagnostic ignored \"-Wc++17-extensions\"") \
+    _Pragma("clang diagnostic ignored \"-Wc++1z-extensions\"") \
+    xpr \
+    JSON_HEDLEY_DIAGNOSTIC_POP
+#      else
+#        define JSON_HEDLEY_DIAGNOSTIC_DISABLE_CPP98_COMPAT_WRAP_(xpr) \
+    JSON_HEDLEY_DIAGNOSTIC_PUSH \
+    _Pragma("clang diagnostic ignored \"-Wc++98-compat\"") \
+    _Pragma("clang diagnostic ignored \"-Wc++17-extensions\"") \
+    xpr \
+    JSON_HEDLEY_DIAGNOSTIC_POP
+#      endif
+#    else
+#      define JSON_HEDLEY_DIAGNOSTIC_DISABLE_CPP98_COMPAT_WRAP_(xpr) \
+    JSON_HEDLEY_DIAGNOSTIC_PUSH \
+    _Pragma("clang diagnostic ignored \"-Wc++98-compat\"") \
+    xpr \
+    JSON_HEDLEY_DIAGNOSTIC_POP
+#    endif
+#  endif
+#endif
+#if !defined(JSON_HEDLEY_DIAGNOSTIC_DISABLE_CPP98_COMPAT_WRAP_)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_CPP98_COMPAT_WRAP_(x) x
+#endif
+
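+/* Cast helpers: under C++ these expand to the corresponding C++ cast;
+   in C they fall back to a plain cast, with CONST_CAST using a GNU
+   statement expression to suppress -Wcast-qual where available. */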
+#if defined(JSON_HEDLEY_CONST_CAST)
+    #undef JSON_HEDLEY_CONST_CAST
+#endif
+#if defined(__cplusplus)
+#  define JSON_HEDLEY_CONST_CAST(T, expr) (const_cast<T>(expr))
+#elif \
+  JSON_HEDLEY_HAS_WARNING("-Wcast-qual") || \
+  JSON_HEDLEY_GCC_VERSION_CHECK(4,6,0) || \
+  JSON_HEDLEY_INTEL_VERSION_CHECK(13,0,0)
+#  define JSON_HEDLEY_CONST_CAST(T, expr) (__extension__ ({ \
+        JSON_HEDLEY_DIAGNOSTIC_PUSH \
+        JSON_HEDLEY_DIAGNOSTIC_DISABLE_CAST_QUAL \
+        ((T) (expr)); \
+        JSON_HEDLEY_DIAGNOSTIC_POP \
+    }))
+#else
+#  define JSON_HEDLEY_CONST_CAST(T, expr) ((T) (expr))
+#endif
+
+#if defined(JSON_HEDLEY_REINTERPRET_CAST)
+    #undef JSON_HEDLEY_REINTERPRET_CAST
+#endif
+#if defined(__cplusplus)
+    #define JSON_HEDLEY_REINTERPRET_CAST(T, expr) (reinterpret_cast<T>(expr))
+#else
+    #define JSON_HEDLEY_REINTERPRET_CAST(T, expr) ((T) (expr))
+#endif
+
+#if defined(JSON_HEDLEY_STATIC_CAST)
+    #undef JSON_HEDLEY_STATIC_CAST
+#endif
+#if defined(__cplusplus)
+    #define JSON_HEDLEY_STATIC_CAST(T, expr) (static_cast<T>(expr))
+#else
+    #define JSON_HEDLEY_STATIC_CAST(T, expr) ((T) (expr))
+#endif
+
+#if defined(JSON_HEDLEY_CPP_CAST)
+    #undef JSON_HEDLEY_CPP_CAST
+#endif
+#if defined(__cplusplus)
+#  if JSON_HEDLEY_HAS_WARNING("-Wold-style-cast")
+#    define JSON_HEDLEY_CPP_CAST(T, expr) \
+    JSON_HEDLEY_DIAGNOSTIC_PUSH \
+    _Pragma("clang diagnostic ignored \"-Wold-style-cast\"") \
+    ((T) (expr)) \
+    JSON_HEDLEY_DIAGNOSTIC_POP
+#  elif JSON_HEDLEY_IAR_VERSION_CHECK(8,3,0)
+#    define JSON_HEDLEY_CPP_CAST(T, expr) \
+    JSON_HEDLEY_DIAGNOSTIC_PUSH \
+    _Pragma("diag_suppress=Pe137") \
+    ((T) (expr)) \
+    JSON_HEDLEY_DIAGNOSTIC_POP
+#  else
+#    define JSON_HEDLEY_CPP_CAST(T, expr) ((T) (expr))
+#  endif
+#else
+#  define JSON_HEDLEY_CPP_CAST(T, expr) (expr)
+#endif
+
+#if defined(JSON_HEDLEY_DIAGNOSTIC_DISABLE_DEPRECATED)
+    #undef JSON_HEDLEY_DIAGNOSTIC_DISABLE_DEPRECATED
+#endif
+#if JSON_HEDLEY_HAS_WARNING("-Wdeprecated-declarations")
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_DEPRECATED _Pragma("clang diagnostic ignored \"-Wdeprecated-declarations\"")
+#elif JSON_HEDLEY_INTEL_VERSION_CHECK(13,0,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_DEPRECATED _Pragma("warning(disable:1478 1786)")
+#elif JSON_HEDLEY_INTEL_CL_VERSION_CHECK(2021,1,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_DEPRECATED __pragma(warning(disable:1478 1786))
+#elif JSON_HEDLEY_PGI_VERSION_CHECK(20,7,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_DEPRECATED _Pragma("diag_suppress 1215,1216,1444,1445")
+#elif JSON_HEDLEY_PGI_VERSION_CHECK(17,10,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_DEPRECATED _Pragma("diag_suppress 1215,1444")
+#elif JSON_HEDLEY_GCC_VERSION_CHECK(4,3,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_DEPRECATED _Pragma("GCC diagnostic ignored \"-Wdeprecated-declarations\"")
+#elif JSON_HEDLEY_MSVC_VERSION_CHECK(15,0,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_DEPRECATED __pragma(warning(disable:4996))
+#elif JSON_HEDLEY_MCST_LCC_VERSION_CHECK(1,25,10)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_DEPRECATED _Pragma("diag_suppress 1215,1444")
+#elif \
+    JSON_HEDLEY_TI_VERSION_CHECK(15,12,0) || \
+    (JSON_HEDLEY_TI_ARMCL_VERSION_CHECK(4,8,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_ARMCL_VERSION_CHECK(5,2,0) || \
+    (JSON_HEDLEY_TI_CL2000_VERSION_CHECK(6,0,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_CL2000_VERSION_CHECK(6,4,0) || \
+    (JSON_HEDLEY_TI_CL430_VERSION_CHECK(4,0,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_CL430_VERSION_CHECK(4,3,0) || \
+    (JSON_HEDLEY_TI_CL6X_VERSION_CHECK(7,2,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_CL6X_VERSION_CHECK(7,5,0) || \
+    JSON_HEDLEY_TI_CL7X_VERSION_CHECK(1,2,0) || \
+    JSON_HEDLEY_TI_CLPRU_VERSION_CHECK(2,1,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_DEPRECATED _Pragma("diag_suppress 1291,1718")
+#elif JSON_HEDLEY_SUNPRO_VERSION_CHECK(5,13,0) && !defined(__cplusplus)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_DEPRECATED _Pragma("error_messages(off,E_DEPRECATED_ATT,E_DEPRECATED_ATT_MESS)")
+#elif JSON_HEDLEY_SUNPRO_VERSION_CHECK(5,13,0) && defined(__cplusplus)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_DEPRECATED _Pragma("error_messages(off,symdeprecated,symdeprecated2)")
+#elif JSON_HEDLEY_IAR_VERSION_CHECK(8,0,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_DEPRECATED _Pragma("diag_suppress=Pe1444,Pe1215")
+#elif JSON_HEDLEY_PELLES_VERSION_CHECK(2,90,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_DEPRECATED _Pragma("warn(disable:2241)")
+#else
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_DEPRECATED
+#endif
+
+#if defined(JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_PRAGMAS)
+    #undef JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_PRAGMAS
+#endif
+#if JSON_HEDLEY_HAS_WARNING("-Wunknown-pragmas")
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_PRAGMAS _Pragma("clang diagnostic ignored \"-Wunknown-pragmas\"")
+#elif JSON_HEDLEY_INTEL_VERSION_CHECK(13,0,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_PRAGMAS _Pragma("warning(disable:161)")
+#elif JSON_HEDLEY_INTEL_CL_VERSION_CHECK(2021,1,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_PRAGMAS __pragma(warning(disable:161))
+#elif JSON_HEDLEY_PGI_VERSION_CHECK(17,10,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_PRAGMAS _Pragma("diag_suppress 1675")
+#elif JSON_HEDLEY_GCC_VERSION_CHECK(4,3,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_PRAGMAS _Pragma("GCC diagnostic ignored \"-Wunknown-pragmas\"")
+#elif JSON_HEDLEY_MSVC_VERSION_CHECK(15,0,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_PRAGMAS __pragma(warning(disable:4068))
+#elif \
+    JSON_HEDLEY_TI_VERSION_CHECK(16,9,0) || \
+    JSON_HEDLEY_TI_CL6X_VERSION_CHECK(8,0,0) || \
+    JSON_HEDLEY_TI_CL7X_VERSION_CHECK(1,2,0) || \
+    JSON_HEDLEY_TI_CLPRU_VERSION_CHECK(2,3,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_PRAGMAS _Pragma("diag_suppress 163")
+#elif JSON_HEDLEY_TI_CL6X_VERSION_CHECK(8,0,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_PRAGMAS _Pragma("diag_suppress 163")
+#elif JSON_HEDLEY_IAR_VERSION_CHECK(8,0,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_PRAGMAS _Pragma("diag_suppress=Pe161")
+#elif JSON_HEDLEY_MCST_LCC_VERSION_CHECK(1,25,10)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_PRAGMAS _Pragma("diag_suppress 161")
+#else
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_PRAGMAS
+#endif
+
+#if defined(JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_CPP_ATTRIBUTES)
+    #undef JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_CPP_ATTRIBUTES
+#endif
+#if JSON_HEDLEY_HAS_WARNING("-Wunknown-attributes")
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_CPP_ATTRIBUTES _Pragma("clang diagnostic ignored \"-Wunknown-attributes\"")
+#elif JSON_HEDLEY_GCC_VERSION_CHECK(4,6,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_CPP_ATTRIBUTES _Pragma("GCC diagnostic ignored \"-Wdeprecated-declarations\"")
+#elif JSON_HEDLEY_INTEL_VERSION_CHECK(17,0,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_CPP_ATTRIBUTES _Pragma("warning(disable:1292)")
+#elif JSON_HEDLEY_INTEL_CL_VERSION_CHECK(2021,1,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_CPP_ATTRIBUTES __pragma(warning(disable:1292))
+#elif JSON_HEDLEY_MSVC_VERSION_CHECK(19,0,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_CPP_ATTRIBUTES __pragma(warning(disable:5030))
+#elif JSON_HEDLEY_PGI_VERSION_CHECK(20,7,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_CPP_ATTRIBUTES _Pragma("diag_suppress 1097,1098")
+#elif JSON_HEDLEY_PGI_VERSION_CHECK(17,10,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_CPP_ATTRIBUTES _Pragma("diag_suppress 1097")
+#elif JSON_HEDLEY_SUNPRO_VERSION_CHECK(5,14,0) && defined(__cplusplus)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_CPP_ATTRIBUTES _Pragma("error_messages(off,attrskipunsup)")
+#elif \
+    JSON_HEDLEY_TI_VERSION_CHECK(18,1,0) || \
+    JSON_HEDLEY_TI_CL6X_VERSION_CHECK(8,3,0) || \
+    JSON_HEDLEY_TI_CL7X_VERSION_CHECK(1,2,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_CPP_ATTRIBUTES _Pragma("diag_suppress 1173")
+#elif JSON_HEDLEY_IAR_VERSION_CHECK(8,0,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_CPP_ATTRIBUTES _Pragma("diag_suppress=Pe1097")
+#elif JSON_HEDLEY_MCST_LCC_VERSION_CHECK(1,25,10)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_CPP_ATTRIBUTES _Pragma("diag_suppress 1097")
+#else
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_CPP_ATTRIBUTES
+#endif
+
+#if defined(JSON_HEDLEY_DIAGNOSTIC_DISABLE_CAST_QUAL)
+    #undef JSON_HEDLEY_DIAGNOSTIC_DISABLE_CAST_QUAL
+#endif
+#if JSON_HEDLEY_HAS_WARNING("-Wcast-qual")
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_CAST_QUAL _Pragma("clang diagnostic ignored \"-Wcast-qual\"")
+#elif JSON_HEDLEY_INTEL_VERSION_CHECK(13,0,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_CAST_QUAL _Pragma("warning(disable:2203 2331)")
+#elif JSON_HEDLEY_GCC_VERSION_CHECK(3,0,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_CAST_QUAL _Pragma("GCC diagnostic ignored \"-Wcast-qual\"")
+#else
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_CAST_QUAL
+#endif
+
+#if defined(JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNUSED_FUNCTION)
+    #undef JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNUSED_FUNCTION
+#endif
+#if JSON_HEDLEY_HAS_WARNING("-Wunused-function")
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNUSED_FUNCTION _Pragma("clang diagnostic ignored \"-Wunused-function\"")
+#elif JSON_HEDLEY_GCC_VERSION_CHECK(3,4,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNUSED_FUNCTION _Pragma("GCC diagnostic ignored \"-Wunused-function\"")
+#elif JSON_HEDLEY_MSVC_VERSION_CHECK(1,0,0)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNUSED_FUNCTION __pragma(warning(disable:4505))
+#elif JSON_HEDLEY_MCST_LCC_VERSION_CHECK(1,25,10)
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNUSED_FUNCTION _Pragma("diag_suppress 3142")
+#else
+    #define JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNUSED_FUNCTION
+#endif
+
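+/* DEPRECATED(since) / DEPRECATED_FOR(since, replacement) attach a
+   deprecation diagnostic, embedding the version (and the suggested
+   replacement) in the message where the compiler supports message
+   strings, otherwise degrading to a bare deprecation marker. */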
+#if defined(JSON_HEDLEY_DEPRECATED)
+    #undef JSON_HEDLEY_DEPRECATED
+#endif
+#if defined(JSON_HEDLEY_DEPRECATED_FOR)
+    #undef JSON_HEDLEY_DEPRECATED_FOR
+#endif
+#if \
+    JSON_HEDLEY_MSVC_VERSION_CHECK(14,0,0) || \
+    JSON_HEDLEY_INTEL_CL_VERSION_CHECK(2021,1,0)
+    #define JSON_HEDLEY_DEPRECATED(since) __declspec(deprecated("Since " # since))
+    #define JSON_HEDLEY_DEPRECATED_FOR(since, replacement) __declspec(deprecated("Since " #since "; use " #replacement))
+#elif \
+    (JSON_HEDLEY_HAS_EXTENSION(attribute_deprecated_with_message) && !defined(JSON_HEDLEY_IAR_VERSION)) || \
+    JSON_HEDLEY_GCC_VERSION_CHECK(4,5,0) || \
+    JSON_HEDLEY_INTEL_VERSION_CHECK(13,0,0) || \
+    JSON_HEDLEY_ARM_VERSION_CHECK(5,6,0) || \
+    JSON_HEDLEY_SUNPRO_VERSION_CHECK(5,13,0) || \
+    JSON_HEDLEY_PGI_VERSION_CHECK(17,10,0) || \
+    JSON_HEDLEY_TI_VERSION_CHECK(18,1,0) || \
+    JSON_HEDLEY_TI_ARMCL_VERSION_CHECK(18,1,0) || \
+    JSON_HEDLEY_TI_CL6X_VERSION_CHECK(8,3,0) || \
+    JSON_HEDLEY_TI_CL7X_VERSION_CHECK(1,2,0) || \
+    JSON_HEDLEY_TI_CLPRU_VERSION_CHECK(2,3,0) || \
+    JSON_HEDLEY_MCST_LCC_VERSION_CHECK(1,25,10)
+    #define JSON_HEDLEY_DEPRECATED(since) __attribute__((__deprecated__("Since " #since)))
+    #define JSON_HEDLEY_DEPRECATED_FOR(since, replacement) __attribute__((__deprecated__("Since " #since "; use " #replacement)))
+#elif defined(__cplusplus) && (__cplusplus >= 201402L)
+    #define JSON_HEDLEY_DEPRECATED(since) JSON_HEDLEY_DIAGNOSTIC_DISABLE_CPP98_COMPAT_WRAP_([[deprecated("Since " #since)]])
+    #define JSON_HEDLEY_DEPRECATED_FOR(since, replacement) JSON_HEDLEY_DIAGNOSTIC_DISABLE_CPP98_COMPAT_WRAP_([[deprecated("Since " #since "; use " #replacement)]])
+#elif \
+    JSON_HEDLEY_HAS_ATTRIBUTE(deprecated) || \
+    JSON_HEDLEY_GCC_VERSION_CHECK(3,1,0) || \
+    JSON_HEDLEY_ARM_VERSION_CHECK(4,1,0) || \
+    JSON_HEDLEY_TI_VERSION_CHECK(15,12,0) || \
+    (JSON_HEDLEY_TI_ARMCL_VERSION_CHECK(4,8,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_ARMCL_VERSION_CHECK(5,2,0) || \
+    (JSON_HEDLEY_TI_CL2000_VERSION_CHECK(6,0,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_CL2000_VERSION_CHECK(6,4,0) || \
+    (JSON_HEDLEY_TI_CL430_VERSION_CHECK(4,0,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_CL430_VERSION_CHECK(4,3,0) || \
+    (JSON_HEDLEY_TI_CL6X_VERSION_CHECK(7,2,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_CL6X_VERSION_CHECK(7,5,0) || \
+    JSON_HEDLEY_TI_CL7X_VERSION_CHECK(1,2,0) || \
+    JSON_HEDLEY_TI_CLPRU_VERSION_CHECK(2,1,0) || \
+    JSON_HEDLEY_MCST_LCC_VERSION_CHECK(1,25,10) || \
+    JSON_HEDLEY_IAR_VERSION_CHECK(8,10,0)
+    #define JSON_HEDLEY_DEPRECATED(since) __attribute__((__deprecated__))
+    #define JSON_HEDLEY_DEPRECATED_FOR(since, replacement) __attribute__((__deprecated__))
+#elif \
+    JSON_HEDLEY_MSVC_VERSION_CHECK(13,10,0) || \
+    JSON_HEDLEY_PELLES_VERSION_CHECK(6,50,0) || \
+    JSON_HEDLEY_INTEL_CL_VERSION_CHECK(2021,1,0)
+    #define JSON_HEDLEY_DEPRECATED(since) __declspec(deprecated)
+    #define JSON_HEDLEY_DEPRECATED_FOR(since, replacement) __declspec(deprecated)
+#elif JSON_HEDLEY_IAR_VERSION_CHECK(8,0,0)
+    #define JSON_HEDLEY_DEPRECATED(since) _Pragma("deprecated")
+    #define JSON_HEDLEY_DEPRECATED_FOR(since, replacement) _Pragma("deprecated")
+#else
+    #define JSON_HEDLEY_DEPRECATED(since)
+    #define JSON_HEDLEY_DEPRECATED_FOR(since, replacement)
+#endif
+
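+/* UNAVAILABLE(available_since) makes calls to the annotated function
+   emit a warning ("Not available until ...") on compilers with
+   __attribute__((warning)); elsewhere it is a no-op. */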
+#if defined(JSON_HEDLEY_UNAVAILABLE)
+    #undef JSON_HEDLEY_UNAVAILABLE
+#endif
+#if \
+    JSON_HEDLEY_HAS_ATTRIBUTE(warning) || \
+    JSON_HEDLEY_GCC_VERSION_CHECK(4,3,0) || \
+    JSON_HEDLEY_INTEL_VERSION_CHECK(13,0,0) || \
+    JSON_HEDLEY_MCST_LCC_VERSION_CHECK(1,25,10)
+    #define JSON_HEDLEY_UNAVAILABLE(available_since) __attribute__((__warning__("Not available until " #available_since)))
+#else
+    #define JSON_HEDLEY_UNAVAILABLE(available_since)
+#endif
+
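+/* WARN_UNUSED_RESULT(_MSG): prefers __attribute__((warn_unused_result)),
+   then C++17 [[nodiscard]].  The message-carrying [[nodiscard("...")]]
+   form needs __has_cpp_attribute(nodiscard) >= 201907L (C++20); on
+   older compilers the message is silently dropped.  SAL's
+   _Check_return_ serves as the MSVC fallback. */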
+#if defined(JSON_HEDLEY_WARN_UNUSED_RESULT)
+    #undef JSON_HEDLEY_WARN_UNUSED_RESULT
+#endif
+#if defined(JSON_HEDLEY_WARN_UNUSED_RESULT_MSG)
+    #undef JSON_HEDLEY_WARN_UNUSED_RESULT_MSG
+#endif
+#if \
+    JSON_HEDLEY_HAS_ATTRIBUTE(warn_unused_result) || \
+    JSON_HEDLEY_GCC_VERSION_CHECK(3,4,0) || \
+    JSON_HEDLEY_INTEL_VERSION_CHECK(13,0,0) || \
+    JSON_HEDLEY_TI_VERSION_CHECK(15,12,0) || \
+    (JSON_HEDLEY_TI_ARMCL_VERSION_CHECK(4,8,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_ARMCL_VERSION_CHECK(5,2,0) || \
+    (JSON_HEDLEY_TI_CL2000_VERSION_CHECK(6,0,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_CL2000_VERSION_CHECK(6,4,0) || \
+    (JSON_HEDLEY_TI_CL430_VERSION_CHECK(4,0,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_CL430_VERSION_CHECK(4,3,0) || \
+    (JSON_HEDLEY_TI_CL6X_VERSION_CHECK(7,2,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_CL6X_VERSION_CHECK(7,5,0) || \
+    JSON_HEDLEY_TI_CL7X_VERSION_CHECK(1,2,0) || \
+    JSON_HEDLEY_TI_CLPRU_VERSION_CHECK(2,1,0) || \
+    (JSON_HEDLEY_SUNPRO_VERSION_CHECK(5,15,0) && defined(__cplusplus)) || \
+    JSON_HEDLEY_PGI_VERSION_CHECK(17,10,0) || \
+    JSON_HEDLEY_MCST_LCC_VERSION_CHECK(1,25,10)
+    #define JSON_HEDLEY_WARN_UNUSED_RESULT __attribute__((__warn_unused_result__))
+    #define JSON_HEDLEY_WARN_UNUSED_RESULT_MSG(msg) __attribute__((__warn_unused_result__))
+#elif (JSON_HEDLEY_HAS_CPP_ATTRIBUTE(nodiscard) >= 201907L)
+    #define JSON_HEDLEY_WARN_UNUSED_RESULT JSON_HEDLEY_DIAGNOSTIC_DISABLE_CPP98_COMPAT_WRAP_([[nodiscard]])
+    #define JSON_HEDLEY_WARN_UNUSED_RESULT_MSG(msg) JSON_HEDLEY_DIAGNOSTIC_DISABLE_CPP98_COMPAT_WRAP_([[nodiscard(msg)]])
+#elif JSON_HEDLEY_HAS_CPP_ATTRIBUTE(nodiscard)
+    #define JSON_HEDLEY_WARN_UNUSED_RESULT JSON_HEDLEY_DIAGNOSTIC_DISABLE_CPP98_COMPAT_WRAP_([[nodiscard]])
+    #define JSON_HEDLEY_WARN_UNUSED_RESULT_MSG(msg) JSON_HEDLEY_DIAGNOSTIC_DISABLE_CPP98_COMPAT_WRAP_([[nodiscard]])
+#elif defined(_Check_return_) /* SAL */
+    #define JSON_HEDLEY_WARN_UNUSED_RESULT _Check_return_
+    #define JSON_HEDLEY_WARN_UNUSED_RESULT_MSG(msg) _Check_return_
+#else
+    #define JSON_HEDLEY_WARN_UNUSED_RESULT
+    #define JSON_HEDLEY_WARN_UNUSED_RESULT_MSG(msg)
+#endif
+
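+/* SENTINEL(position): the variadic argument list must be terminated by
+   NULL, "position" arguments from the end (GCC's sentinel attribute). */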
+#if defined(JSON_HEDLEY_SENTINEL)
+    #undef JSON_HEDLEY_SENTINEL
+#endif
+#if \
+    JSON_HEDLEY_HAS_ATTRIBUTE(sentinel) || \
+    JSON_HEDLEY_GCC_VERSION_CHECK(4,0,0) || \
+    JSON_HEDLEY_INTEL_VERSION_CHECK(13,0,0) || \
+    JSON_HEDLEY_ARM_VERSION_CHECK(5,4,0) || \
+    JSON_HEDLEY_MCST_LCC_VERSION_CHECK(1,25,10)
+    #define JSON_HEDLEY_SENTINEL(position) __attribute__((__sentinel__(position)))
+#else
+    #define JSON_HEDLEY_SENTINEL(position)
+#endif
+
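+/* NO_RETURN marks a function that never returns, picking whichever of
+   IAR __noreturn, C11 _Noreturn, C++11 [[noreturn]],
+   __attribute__((noreturn)) or __declspec(noreturn) the compiler
+   understands, e.g.:  JSON_HEDLEY_NO_RETURN void fatal(const char*); */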
+#if defined(JSON_HEDLEY_NO_RETURN)
+    #undef JSON_HEDLEY_NO_RETURN
+#endif
+#if JSON_HEDLEY_IAR_VERSION_CHECK(8,0,0)
+    #define JSON_HEDLEY_NO_RETURN __noreturn
+#elif \
+    JSON_HEDLEY_INTEL_VERSION_CHECK(13,0,0) || \
+    JSON_HEDLEY_MCST_LCC_VERSION_CHECK(1,25,10)
+    #define JSON_HEDLEY_NO_RETURN __attribute__((__noreturn__))
+#elif defined(__STDC_VERSION__) && __STDC_VERSION__ >= 201112L
+    #define JSON_HEDLEY_NO_RETURN _Noreturn
+#elif defined(__cplusplus) && (__cplusplus >= 201103L)
+    #define JSON_HEDLEY_NO_RETURN JSON_HEDLEY_DIAGNOSTIC_DISABLE_CPP98_COMPAT_WRAP_([[noreturn]])
+#elif \
+    JSON_HEDLEY_HAS_ATTRIBUTE(noreturn) || \
+    JSON_HEDLEY_GCC_VERSION_CHECK(3,2,0) || \
+    JSON_HEDLEY_SUNPRO_VERSION_CHECK(5,11,0) || \
+    JSON_HEDLEY_ARM_VERSION_CHECK(4,1,0) || \
+    JSON_HEDLEY_IBM_VERSION_CHECK(10,1,0) || \
+    JSON_HEDLEY_TI_VERSION_CHECK(15,12,0) || \
+    (JSON_HEDLEY_TI_ARMCL_VERSION_CHECK(4,8,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_ARMCL_VERSION_CHECK(5,2,0) || \
+    (JSON_HEDLEY_TI_CL2000_VERSION_CHECK(6,0,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_CL2000_VERSION_CHECK(6,4,0) || \
+    (JSON_HEDLEY_TI_CL430_VERSION_CHECK(4,0,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_CL430_VERSION_CHECK(4,3,0) || \
+    (JSON_HEDLEY_TI_CL6X_VERSION_CHECK(7,2,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_CL6X_VERSION_CHECK(7,5,0) || \
+    JSON_HEDLEY_TI_CL7X_VERSION_CHECK(1,2,0) || \
+    JSON_HEDLEY_TI_CLPRU_VERSION_CHECK(2,1,0) || \
+    JSON_HEDLEY_IAR_VERSION_CHECK(8,10,0)
+    #define JSON_HEDLEY_NO_RETURN __attribute__((__noreturn__))
+#elif JSON_HEDLEY_SUNPRO_VERSION_CHECK(5,10,0)
+    #define JSON_HEDLEY_NO_RETURN _Pragma("does_not_return")
+#elif \
+    JSON_HEDLEY_MSVC_VERSION_CHECK(13,10,0) || \
+    JSON_HEDLEY_INTEL_CL_VERSION_CHECK(2021,1,0)
+    #define JSON_HEDLEY_NO_RETURN __declspec(noreturn)
+#elif JSON_HEDLEY_TI_CL6X_VERSION_CHECK(6,0,0) && defined(__cplusplus)
+    #define JSON_HEDLEY_NO_RETURN _Pragma("FUNC_NEVER_RETURNS;")
+#elif JSON_HEDLEY_COMPCERT_VERSION_CHECK(3,2,0)
+    #define JSON_HEDLEY_NO_RETURN __attribute((noreturn))
+#elif JSON_HEDLEY_PELLES_VERSION_CHECK(9,0,0)
+    #define JSON_HEDLEY_NO_RETURN __declspec(noreturn)
+#else
+    #define JSON_HEDLEY_NO_RETURN
+#endif
+
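+/* NO_ESCAPE promises that a pointer parameter is not captured beyond
+   the call (clang's noescape attribute); a no-op elsewhere. */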
+#if defined(JSON_HEDLEY_NO_ESCAPE)
+    #undef JSON_HEDLEY_NO_ESCAPE
+#endif
+#if JSON_HEDLEY_HAS_ATTRIBUTE(noescape)
+    #define JSON_HEDLEY_NO_ESCAPE __attribute__((__noescape__))
+#else
+    #define JSON_HEDLEY_NO_ESCAPE
+#endif
+
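+/* ASSUME and UNREACHABLE are defined in terms of each other.  With
+   only __builtin_unreachable(), ASSUME(expr) becomes
+   "expr ? (void)1 : unreachable"; with only __assume()/_nassert(),
+   UNREACHABLE() becomes ASSUME(0).  With neither, ASSUME merely
+   evaluates its argument and UNREACHABLE_RETURN(v) really returns v. */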
+#if defined(JSON_HEDLEY_UNREACHABLE)
+    #undef JSON_HEDLEY_UNREACHABLE
+#endif
+#if defined(JSON_HEDLEY_UNREACHABLE_RETURN)
+    #undef JSON_HEDLEY_UNREACHABLE_RETURN
+#endif
+#if defined(JSON_HEDLEY_ASSUME)
+    #undef JSON_HEDLEY_ASSUME
+#endif
+#if \
+    JSON_HEDLEY_MSVC_VERSION_CHECK(13,10,0) || \
+    JSON_HEDLEY_INTEL_VERSION_CHECK(13,0,0) || \
+    JSON_HEDLEY_INTEL_CL_VERSION_CHECK(2021,1,0)
+    #define JSON_HEDLEY_ASSUME(expr) __assume(expr)
+#elif JSON_HEDLEY_HAS_BUILTIN(__builtin_assume)
+    #define JSON_HEDLEY_ASSUME(expr) __builtin_assume(expr)
+#elif \
+    JSON_HEDLEY_TI_CL2000_VERSION_CHECK(6,2,0) || \
+    JSON_HEDLEY_TI_CL6X_VERSION_CHECK(4,0,0)
+    #if defined(__cplusplus)
+        #define JSON_HEDLEY_ASSUME(expr) std::_nassert(expr)
+    #else
+        #define JSON_HEDLEY_ASSUME(expr) _nassert(expr)
+    #endif
+#endif
+#if \
+    (JSON_HEDLEY_HAS_BUILTIN(__builtin_unreachable) && (!defined(JSON_HEDLEY_ARM_VERSION))) || \
+    JSON_HEDLEY_GCC_VERSION_CHECK(4,5,0) || \
+    JSON_HEDLEY_PGI_VERSION_CHECK(18,10,0) || \
+    JSON_HEDLEY_INTEL_VERSION_CHECK(13,0,0) || \
+    JSON_HEDLEY_IBM_VERSION_CHECK(13,1,5) || \
+    JSON_HEDLEY_CRAY_VERSION_CHECK(10,0,0) || \
+    JSON_HEDLEY_MCST_LCC_VERSION_CHECK(1,25,10)
+    #define JSON_HEDLEY_UNREACHABLE() __builtin_unreachable()
+#elif defined(JSON_HEDLEY_ASSUME)
+    #define JSON_HEDLEY_UNREACHABLE() JSON_HEDLEY_ASSUME(0)
+#endif
+#if !defined(JSON_HEDLEY_ASSUME)
+    #if defined(JSON_HEDLEY_UNREACHABLE)
+        #define JSON_HEDLEY_ASSUME(expr) JSON_HEDLEY_STATIC_CAST(void, ((expr) ? 1 : (JSON_HEDLEY_UNREACHABLE(), 1)))
+    #else
+        #define JSON_HEDLEY_ASSUME(expr) JSON_HEDLEY_STATIC_CAST(void, expr)
+    #endif
+#endif
+#if defined(JSON_HEDLEY_UNREACHABLE)
+    #if  \
+        JSON_HEDLEY_TI_CL2000_VERSION_CHECK(6,2,0) || \
+        JSON_HEDLEY_TI_CL6X_VERSION_CHECK(4,0,0)
+        #define JSON_HEDLEY_UNREACHABLE_RETURN(value) return (JSON_HEDLEY_STATIC_CAST(void, JSON_HEDLEY_ASSUME(0)), (value))
+    #else
+        #define JSON_HEDLEY_UNREACHABLE_RETURN(value) JSON_HEDLEY_UNREACHABLE()
+    #endif
+#else
+    #define JSON_HEDLEY_UNREACHABLE_RETURN(value) return (value)
+#endif
+#if !defined(JSON_HEDLEY_UNREACHABLE)
+    #define JSON_HEDLEY_UNREACHABLE() JSON_HEDLEY_ASSUME(0)
+#endif
+
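+/* NON_NULL(...) takes the 1-based indices of pointer parameters that
+   must not be NULL.  The surrounding push/pop silences warnings about
+   variadic macros on pre-C99 / pre-C++11 dialects while it is defined. */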
+JSON_HEDLEY_DIAGNOSTIC_PUSH
+#if JSON_HEDLEY_HAS_WARNING("-Wpedantic")
+    #pragma clang diagnostic ignored "-Wpedantic"
+#endif
+#if JSON_HEDLEY_HAS_WARNING("-Wc++98-compat-pedantic") && defined(__cplusplus)
+    #pragma clang diagnostic ignored "-Wc++98-compat-pedantic"
+#endif
+#if JSON_HEDLEY_GCC_HAS_WARNING("-Wvariadic-macros",4,0,0)
+    #if defined(__clang__)
+        #pragma clang diagnostic ignored "-Wvariadic-macros"
+    #elif defined(JSON_HEDLEY_GCC_VERSION)
+        #pragma GCC diagnostic ignored "-Wvariadic-macros"
+    #endif
+#endif
+#if defined(JSON_HEDLEY_NON_NULL)
+    #undef JSON_HEDLEY_NON_NULL
+#endif
+#if \
+    JSON_HEDLEY_HAS_ATTRIBUTE(nonnull) || \
+    JSON_HEDLEY_GCC_VERSION_CHECK(3,3,0) || \
+    JSON_HEDLEY_INTEL_VERSION_CHECK(13,0,0) || \
+    JSON_HEDLEY_ARM_VERSION_CHECK(4,1,0)
+    #define JSON_HEDLEY_NON_NULL(...) __attribute__((__nonnull__(__VA_ARGS__)))
+#else
+    #define JSON_HEDLEY_NON_NULL(...)
+#endif
+JSON_HEDLEY_DIAGNOSTIC_POP
+
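+/* PRINTF_FORMAT(string_idx, first_to_check) lets the compiler
+   type-check printf-style calls.  On MinGW the ms_printf or gnu_printf
+   archetype is chosen depending on whether __USE_MINGW_ANSI_STDIO
+   routes stdio through the GNU implementation. */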
+#if defined(JSON_HEDLEY_PRINTF_FORMAT)
+    #undef JSON_HEDLEY_PRINTF_FORMAT
+#endif
+#if defined(__MINGW32__) && JSON_HEDLEY_GCC_HAS_ATTRIBUTE(format,4,4,0) && !defined(__USE_MINGW_ANSI_STDIO)
+    #define JSON_HEDLEY_PRINTF_FORMAT(string_idx,first_to_check) __attribute__((__format__(ms_printf, string_idx, first_to_check)))
+#elif defined(__MINGW32__) && JSON_HEDLEY_GCC_HAS_ATTRIBUTE(format,4,4,0) && defined(__USE_MINGW_ANSI_STDIO)
+    #define JSON_HEDLEY_PRINTF_FORMAT(string_idx,first_to_check) __attribute__((__format__(gnu_printf, string_idx, first_to_check)))
+#elif \
+    JSON_HEDLEY_HAS_ATTRIBUTE(format) || \
+    JSON_HEDLEY_GCC_VERSION_CHECK(3,1,0) || \
+    JSON_HEDLEY_INTEL_VERSION_CHECK(13,0,0) || \
+    JSON_HEDLEY_ARM_VERSION_CHECK(5,6,0) || \
+    JSON_HEDLEY_IBM_VERSION_CHECK(10,1,0) || \
+    JSON_HEDLEY_TI_VERSION_CHECK(15,12,0) || \
+    (JSON_HEDLEY_TI_ARMCL_VERSION_CHECK(4,8,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_ARMCL_VERSION_CHECK(5,2,0) || \
+    (JSON_HEDLEY_TI_CL2000_VERSION_CHECK(6,0,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_CL2000_VERSION_CHECK(6,4,0) || \
+    (JSON_HEDLEY_TI_CL430_VERSION_CHECK(4,0,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_CL430_VERSION_CHECK(4,3,0) || \
+    (JSON_HEDLEY_TI_CL6X_VERSION_CHECK(7,2,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_CL6X_VERSION_CHECK(7,5,0) || \
+    JSON_HEDLEY_TI_CL7X_VERSION_CHECK(1,2,0) || \
+    JSON_HEDLEY_TI_CLPRU_VERSION_CHECK(2,1,0) || \
+    JSON_HEDLEY_MCST_LCC_VERSION_CHECK(1,25,10)
+    #define JSON_HEDLEY_PRINTF_FORMAT(string_idx,first_to_check) __attribute__((__format__(__printf__, string_idx, first_to_check)))
+#elif JSON_HEDLEY_PELLES_VERSION_CHECK(6,0,0)
+    #define JSON_HEDLEY_PRINTF_FORMAT(string_idx,first_to_check) __declspec(vaformat(printf,string_idx,first_to_check))
+#else
+    #define JSON_HEDLEY_PRINTF_FORMAT(string_idx,first_to_check)
+#endif
+
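+/* CONSTEXPR expands to constexpr under C++11 and later (wrapped to
+   silence -Wc++98-compat) and to nothing otherwise. */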
+#if defined(JSON_HEDLEY_CONSTEXPR)
+    #undef JSON_HEDLEY_CONSTEXPR
+#endif
+#if defined(__cplusplus)
+    #if __cplusplus >= 201103L
+        #define JSON_HEDLEY_CONSTEXPR JSON_HEDLEY_DIAGNOSTIC_DISABLE_CPP98_COMPAT_WRAP_(constexpr)
+    #endif
+#endif
+#if !defined(JSON_HEDLEY_CONSTEXPR)
+    #define JSON_HEDLEY_CONSTEXPR
+#endif
+
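+/* Branch-prediction hints.  JSON_HEDLEY_PREDICT(expr, value, prob)
+   tells the optimizer that expr == value with the given probability;
+   PREDICT_TRUE/FALSE are the boolean forms and LIKELY/UNLIKELY the
+   classic __builtin_expect shorthands, e.g.
+   "if (JSON_HEDLEY_UNLIKELY(p == NULL)) { ... }".  Without
+   __builtin_expect_with_probability the probability is approximated:
+   >= 0.9 hints the expected value (the boolean forms also hint the
+   opposite at <= 0.1); anything else compiles to the bare expression. */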
+#if defined(JSON_HEDLEY_PREDICT)
+    #undef JSON_HEDLEY_PREDICT
+#endif
+#if defined(JSON_HEDLEY_LIKELY)
+    #undef JSON_HEDLEY_LIKELY
+#endif
+#if defined(JSON_HEDLEY_UNLIKELY)
+    #undef JSON_HEDLEY_UNLIKELY
+#endif
+#if defined(JSON_HEDLEY_UNPREDICTABLE)
+    #undef JSON_HEDLEY_UNPREDICTABLE
+#endif
+#if JSON_HEDLEY_HAS_BUILTIN(__builtin_unpredictable)
+    #define JSON_HEDLEY_UNPREDICTABLE(expr) __builtin_unpredictable((expr))
+#endif
+#if \
+  (JSON_HEDLEY_HAS_BUILTIN(__builtin_expect_with_probability) && !defined(JSON_HEDLEY_PGI_VERSION)) || \
+  JSON_HEDLEY_GCC_VERSION_CHECK(9,0,0) || \
+  JSON_HEDLEY_MCST_LCC_VERSION_CHECK(1,25,10)
+#  define JSON_HEDLEY_PREDICT(expr, value, probability) __builtin_expect_with_probability(  (expr), (value), (probability))
+#  define JSON_HEDLEY_PREDICT_TRUE(expr, probability)   __builtin_expect_with_probability(!!(expr),    1   , (probability))
+#  define JSON_HEDLEY_PREDICT_FALSE(expr, probability)  __builtin_expect_with_probability(!!(expr),    0   , (probability))
+#  define JSON_HEDLEY_LIKELY(expr)                      __builtin_expect                 (!!(expr),    1                  )
+#  define JSON_HEDLEY_UNLIKELY(expr)                    __builtin_expect                 (!!(expr),    0                  )
+#elif \
+  (JSON_HEDLEY_HAS_BUILTIN(__builtin_expect) && !defined(JSON_HEDLEY_INTEL_CL_VERSION)) || \
+  JSON_HEDLEY_GCC_VERSION_CHECK(3,0,0) || \
+  JSON_HEDLEY_INTEL_VERSION_CHECK(13,0,0) || \
+  (JSON_HEDLEY_SUNPRO_VERSION_CHECK(5,15,0) && defined(__cplusplus)) || \
+  JSON_HEDLEY_ARM_VERSION_CHECK(4,1,0) || \
+  JSON_HEDLEY_IBM_VERSION_CHECK(10,1,0) || \
+  JSON_HEDLEY_TI_VERSION_CHECK(15,12,0) || \
+  JSON_HEDLEY_TI_ARMCL_VERSION_CHECK(4,7,0) || \
+  JSON_HEDLEY_TI_CL430_VERSION_CHECK(3,1,0) || \
+  JSON_HEDLEY_TI_CL2000_VERSION_CHECK(6,1,0) || \
+  JSON_HEDLEY_TI_CL6X_VERSION_CHECK(6,1,0) || \
+  JSON_HEDLEY_TI_CL7X_VERSION_CHECK(1,2,0) || \
+  JSON_HEDLEY_TI_CLPRU_VERSION_CHECK(2,1,0) || \
+  JSON_HEDLEY_TINYC_VERSION_CHECK(0,9,27) || \
+  JSON_HEDLEY_CRAY_VERSION_CHECK(8,1,0) || \
+  JSON_HEDLEY_MCST_LCC_VERSION_CHECK(1,25,10)
+#  define JSON_HEDLEY_PREDICT(expr, expected, probability) \
+    (((probability) >= 0.9) ? __builtin_expect((expr), (expected)) : (JSON_HEDLEY_STATIC_CAST(void, expected), (expr)))
+#  define JSON_HEDLEY_PREDICT_TRUE(expr, probability) \
+    (__extension__ ({ \
+        double hedley_probability_ = (probability); \
+        ((hedley_probability_ >= 0.9) ? __builtin_expect(!!(expr), 1) : ((hedley_probability_ <= 0.1) ? __builtin_expect(!!(expr), 0) : !!(expr))); \
+    }))
+#  define JSON_HEDLEY_PREDICT_FALSE(expr, probability) \
+    (__extension__ ({ \
+        double hedley_probability_ = (probability); \
+        ((hedley_probability_ >= 0.9) ? __builtin_expect(!!(expr), 0) : ((hedley_probability_ <= 0.1) ? __builtin_expect(!!(expr), 1) : !!(expr))); \
+    }))
+#  define JSON_HEDLEY_LIKELY(expr)   __builtin_expect(!!(expr), 1)
+#  define JSON_HEDLEY_UNLIKELY(expr) __builtin_expect(!!(expr), 0)
+#else
+#  define JSON_HEDLEY_PREDICT(expr, expected, probability) (JSON_HEDLEY_STATIC_CAST(void, expected), (expr))
+#  define JSON_HEDLEY_PREDICT_TRUE(expr, probability) (!!(expr))
+#  define JSON_HEDLEY_PREDICT_FALSE(expr, probability) (!!(expr))
+#  define JSON_HEDLEY_LIKELY(expr) (!!(expr))
+#  define JSON_HEDLEY_UNLIKELY(expr) (!!(expr))
+#endif
+#if !defined(JSON_HEDLEY_UNPREDICTABLE)
+    #define JSON_HEDLEY_UNPREDICTABLE(expr) JSON_HEDLEY_PREDICT(expr, 1, 0.5)
+#endif
+
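+/* MALLOC promises that the function behaves like malloc: the returned
+   pointer does not alias any other live pointer, which enables better
+   alias analysis.  Maps to __attribute__((malloc)) or MSVC's
+   __declspec(restrict). */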
+#if defined(JSON_HEDLEY_MALLOC)
+    #undef JSON_HEDLEY_MALLOC
+#endif
+#if \
+    JSON_HEDLEY_HAS_ATTRIBUTE(malloc) || \
+    JSON_HEDLEY_GCC_VERSION_CHECK(3,1,0) || \
+    JSON_HEDLEY_INTEL_VERSION_CHECK(13,0,0) || \
+    JSON_HEDLEY_SUNPRO_VERSION_CHECK(5,11,0) || \
+    JSON_HEDLEY_ARM_VERSION_CHECK(4,1,0) || \
+    JSON_HEDLEY_IBM_VERSION_CHECK(12,1,0) || \
+    JSON_HEDLEY_TI_VERSION_CHECK(15,12,0) || \
+    (JSON_HEDLEY_TI_ARMCL_VERSION_CHECK(4,8,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_ARMCL_VERSION_CHECK(5,2,0) || \
+    (JSON_HEDLEY_TI_CL2000_VERSION_CHECK(6,0,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_CL2000_VERSION_CHECK(6,4,0) || \
+    (JSON_HEDLEY_TI_CL430_VERSION_CHECK(4,0,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_CL430_VERSION_CHECK(4,3,0) || \
+    (JSON_HEDLEY_TI_CL6X_VERSION_CHECK(7,2,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_CL6X_VERSION_CHECK(7,5,0) || \
+    JSON_HEDLEY_TI_CL7X_VERSION_CHECK(1,2,0) || \
+    JSON_HEDLEY_TI_CLPRU_VERSION_CHECK(2,1,0) || \
+    JSON_HEDLEY_MCST_LCC_VERSION_CHECK(1,25,10)
+    #define JSON_HEDLEY_MALLOC __attribute__((__malloc__))
+#elif JSON_HEDLEY_SUNPRO_VERSION_CHECK(5,10,0)
+    #define JSON_HEDLEY_MALLOC _Pragma("returns_new_memory")
+#elif \
+    JSON_HEDLEY_MSVC_VERSION_CHECK(14,0,0) || \
+    JSON_HEDLEY_INTEL_CL_VERSION_CHECK(2021,1,0)
+    #define JSON_HEDLEY_MALLOC __declspec(restrict)
+#else
+    #define JSON_HEDLEY_MALLOC
+#endif
+
+#if defined(JSON_HEDLEY_PURE)
+    #undef JSON_HEDLEY_PURE
+#endif
+#if \
+  JSON_HEDLEY_HAS_ATTRIBUTE(pure) || \
+  JSON_HEDLEY_GCC_VERSION_CHECK(2,96,0) || \
+  JSON_HEDLEY_INTEL_VERSION_CHECK(13,0,0) || \
+  JSON_HEDLEY_SUNPRO_VERSION_CHECK(5,11,0) || \
+  JSON_HEDLEY_ARM_VERSION_CHECK(4,1,0) || \
+  JSON_HEDLEY_IBM_VERSION_CHECK(10,1,0) || \
+  JSON_HEDLEY_TI_VERSION_CHECK(15,12,0) || \
+  (JSON_HEDLEY_TI_ARMCL_VERSION_CHECK(4,8,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+  JSON_HEDLEY_TI_ARMCL_VERSION_CHECK(5,2,0) || \
+  (JSON_HEDLEY_TI_CL2000_VERSION_CHECK(6,0,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+  JSON_HEDLEY_TI_CL2000_VERSION_CHECK(6,4,0) || \
+  (JSON_HEDLEY_TI_CL430_VERSION_CHECK(4,0,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+  JSON_HEDLEY_TI_CL430_VERSION_CHECK(4,3,0) || \
+  (JSON_HEDLEY_TI_CL6X_VERSION_CHECK(7,2,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+  JSON_HEDLEY_TI_CL6X_VERSION_CHECK(7,5,0) || \
+  JSON_HEDLEY_TI_CL7X_VERSION_CHECK(1,2,0) || \
+  JSON_HEDLEY_TI_CLPRU_VERSION_CHECK(2,1,0) || \
+  JSON_HEDLEY_PGI_VERSION_CHECK(17,10,0) || \
+  JSON_HEDLEY_MCST_LCC_VERSION_CHECK(1,25,10)
+#  define JSON_HEDLEY_PURE __attribute__((__pure__))
+#elif JSON_HEDLEY_SUNPRO_VERSION_CHECK(5,10,0)
+#  define JSON_HEDLEY_PURE _Pragma("does_not_write_global_data")
+#elif defined(__cplusplus) && \
+    ( \
+      JSON_HEDLEY_TI_CL430_VERSION_CHECK(2,0,1) || \
+      JSON_HEDLEY_TI_CL6X_VERSION_CHECK(4,0,0) || \
+      JSON_HEDLEY_TI_CL7X_VERSION_CHECK(1,2,0) \
+    )
+#  define JSON_HEDLEY_PURE _Pragma("FUNC_IS_PURE;")
+#else
+#  define JSON_HEDLEY_PURE
+#endif
+
+#if defined(JSON_HEDLEY_CONST)
+    #undef JSON_HEDLEY_CONST
+#endif
+#if \
+    JSON_HEDLEY_HAS_ATTRIBUTE(const) || \
+    JSON_HEDLEY_GCC_VERSION_CHECK(2,5,0) || \
+    JSON_HEDLEY_INTEL_VERSION_CHECK(13,0,0) || \
+    JSON_HEDLEY_SUNPRO_VERSION_CHECK(5,11,0) || \
+    JSON_HEDLEY_ARM_VERSION_CHECK(4,1,0) || \
+    JSON_HEDLEY_IBM_VERSION_CHECK(10,1,0) || \
+    JSON_HEDLEY_TI_VERSION_CHECK(15,12,0) || \
+    (JSON_HEDLEY_TI_ARMCL_VERSION_CHECK(4,8,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_ARMCL_VERSION_CHECK(5,2,0) || \
+    (JSON_HEDLEY_TI_CL2000_VERSION_CHECK(6,0,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_CL2000_VERSION_CHECK(6,4,0) || \
+    (JSON_HEDLEY_TI_CL430_VERSION_CHECK(4,0,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_CL430_VERSION_CHECK(4,3,0) || \
+    (JSON_HEDLEY_TI_CL6X_VERSION_CHECK(7,2,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_CL6X_VERSION_CHECK(7,5,0) || \
+    JSON_HEDLEY_TI_CL7X_VERSION_CHECK(1,2,0) || \
+    JSON_HEDLEY_TI_CLPRU_VERSION_CHECK(2,1,0) || \
+    JSON_HEDLEY_PGI_VERSION_CHECK(17,10,0) || \
+    JSON_HEDLEY_MCST_LCC_VERSION_CHECK(1,25,10)
+    #define JSON_HEDLEY_CONST __attribute__((__const__))
+#elif \
+    JSON_HEDLEY_SUNPRO_VERSION_CHECK(5,10,0)
+    #define JSON_HEDLEY_CONST _Pragma("no_side_effect")
+#else
+    #define JSON_HEDLEY_CONST JSON_HEDLEY_PURE
+#endif
+
+#if defined(JSON_HEDLEY_RESTRICT)
+    #undef JSON_HEDLEY_RESTRICT
+#endif
+#if defined(__STDC_VERSION__) && (__STDC_VERSION__ >= 199901L) && !defined(__cplusplus)
+    #define JSON_HEDLEY_RESTRICT restrict
+#elif \
+    JSON_HEDLEY_GCC_VERSION_CHECK(3,1,0) || \
+    JSON_HEDLEY_MSVC_VERSION_CHECK(14,0,0) || \
+    JSON_HEDLEY_INTEL_VERSION_CHECK(13,0,0) || \
+    JSON_HEDLEY_INTEL_CL_VERSION_CHECK(2021,1,0) || \
+    JSON_HEDLEY_ARM_VERSION_CHECK(4,1,0) || \
+    JSON_HEDLEY_IBM_VERSION_CHECK(10,1,0) || \
+    JSON_HEDLEY_PGI_VERSION_CHECK(17,10,0) || \
+    JSON_HEDLEY_TI_CL430_VERSION_CHECK(4,3,0) || \
+    JSON_HEDLEY_TI_CL2000_VERSION_CHECK(6,2,4) || \
+    JSON_HEDLEY_TI_CL6X_VERSION_CHECK(8,1,0) || \
+    JSON_HEDLEY_TI_CL7X_VERSION_CHECK(1,2,0) || \
+    (JSON_HEDLEY_SUNPRO_VERSION_CHECK(5,14,0) && defined(__cplusplus)) || \
+    JSON_HEDLEY_IAR_VERSION_CHECK(8,0,0) || \
+    defined(__clang__) || \
+    JSON_HEDLEY_MCST_LCC_VERSION_CHECK(1,25,10)
+    #define JSON_HEDLEY_RESTRICT __restrict
+#elif JSON_HEDLEY_SUNPRO_VERSION_CHECK(5,3,0) && !defined(__cplusplus)
+    #define JSON_HEDLEY_RESTRICT _Restrict
+#else
+    #define JSON_HEDLEY_RESTRICT
+#endif
+
+#if defined(JSON_HEDLEY_INLINE)
+    #undef JSON_HEDLEY_INLINE
+#endif
+#if \
+    (defined(__STDC_VERSION__) && (__STDC_VERSION__ >= 199901L)) || \
+    (defined(__cplusplus) && (__cplusplus >= 199711L))
+    #define JSON_HEDLEY_INLINE inline
+#elif \
+    defined(JSON_HEDLEY_GCC_VERSION) || \
+    JSON_HEDLEY_ARM_VERSION_CHECK(6,2,0)
+    #define JSON_HEDLEY_INLINE __inline__
+#elif \
+    JSON_HEDLEY_MSVC_VERSION_CHECK(12,0,0) || \
+    JSON_HEDLEY_INTEL_CL_VERSION_CHECK(2021,1,0) || \
+    JSON_HEDLEY_ARM_VERSION_CHECK(4,1,0) || \
+    JSON_HEDLEY_TI_ARMCL_VERSION_CHECK(5,1,0) || \
+    JSON_HEDLEY_TI_CL430_VERSION_CHECK(3,1,0) || \
+    JSON_HEDLEY_TI_CL2000_VERSION_CHECK(6,2,0) || \
+    JSON_HEDLEY_TI_CL6X_VERSION_CHECK(8,0,0) || \
+    JSON_HEDLEY_TI_CL7X_VERSION_CHECK(1,2,0) || \
+    JSON_HEDLEY_TI_CLPRU_VERSION_CHECK(2,1,0) || \
+    JSON_HEDLEY_MCST_LCC_VERSION_CHECK(1,25,10)
+    #define JSON_HEDLEY_INLINE __inline
+#else
+    #define JSON_HEDLEY_INLINE
+#endif
+
+#if defined(JSON_HEDLEY_ALWAYS_INLINE)
+    #undef JSON_HEDLEY_ALWAYS_INLINE
+#endif
+#if \
+  JSON_HEDLEY_HAS_ATTRIBUTE(always_inline) || \
+  JSON_HEDLEY_GCC_VERSION_CHECK(4,0,0) || \
+  JSON_HEDLEY_INTEL_VERSION_CHECK(13,0,0) || \
+  JSON_HEDLEY_SUNPRO_VERSION_CHECK(5,11,0) || \
+  JSON_HEDLEY_ARM_VERSION_CHECK(4,1,0) || \
+  JSON_HEDLEY_IBM_VERSION_CHECK(10,1,0) || \
+  JSON_HEDLEY_TI_VERSION_CHECK(15,12,0) || \
+  (JSON_HEDLEY_TI_ARMCL_VERSION_CHECK(4,8,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+  JSON_HEDLEY_TI_ARMCL_VERSION_CHECK(5,2,0) || \
+  (JSON_HEDLEY_TI_CL2000_VERSION_CHECK(6,0,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+  JSON_HEDLEY_TI_CL2000_VERSION_CHECK(6,4,0) || \
+  (JSON_HEDLEY_TI_CL430_VERSION_CHECK(4,0,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+  JSON_HEDLEY_TI_CL430_VERSION_CHECK(4,3,0) || \
+  (JSON_HEDLEY_TI_CL6X_VERSION_CHECK(7,2,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+  JSON_HEDLEY_TI_CL6X_VERSION_CHECK(7,5,0) || \
+  JSON_HEDLEY_TI_CL7X_VERSION_CHECK(1,2,0) || \
+  JSON_HEDLEY_TI_CLPRU_VERSION_CHECK(2,1,0) || \
+  JSON_HEDLEY_MCST_LCC_VERSION_CHECK(1,25,10) || \
+  JSON_HEDLEY_IAR_VERSION_CHECK(8,10,0)
+#  define JSON_HEDLEY_ALWAYS_INLINE __attribute__((__always_inline__)) JSON_HEDLEY_INLINE
+#elif \
+  JSON_HEDLEY_MSVC_VERSION_CHECK(12,0,0) || \
+  JSON_HEDLEY_INTEL_CL_VERSION_CHECK(2021,1,0)
+#  define JSON_HEDLEY_ALWAYS_INLINE __forceinline
+#elif defined(__cplusplus) && \
+    ( \
+      JSON_HEDLEY_TI_ARMCL_VERSION_CHECK(5,2,0) || \
+      JSON_HEDLEY_TI_CL430_VERSION_CHECK(4,3,0) || \
+      JSON_HEDLEY_TI_CL2000_VERSION_CHECK(6,4,0) || \
+      JSON_HEDLEY_TI_CL6X_VERSION_CHECK(6,1,0) || \
+      JSON_HEDLEY_TI_CL7X_VERSION_CHECK(1,2,0) || \
+      JSON_HEDLEY_TI_CLPRU_VERSION_CHECK(2,1,0) \
+    )
+#  define JSON_HEDLEY_ALWAYS_INLINE _Pragma("FUNC_ALWAYS_INLINE;")
+#elif JSON_HEDLEY_IAR_VERSION_CHECK(8,0,0)
+#  define JSON_HEDLEY_ALWAYS_INLINE _Pragma("inline=forced")
+#else
+#  define JSON_HEDLEY_ALWAYS_INLINE JSON_HEDLEY_INLINE
+#endif
+
+#if defined(JSON_HEDLEY_NEVER_INLINE)
+    #undef JSON_HEDLEY_NEVER_INLINE
+#endif
+#if \
+    JSON_HEDLEY_HAS_ATTRIBUTE(noinline) || \
+    JSON_HEDLEY_GCC_VERSION_CHECK(4,0,0) || \
+    JSON_HEDLEY_INTEL_VERSION_CHECK(13,0,0) || \
+    JSON_HEDLEY_SUNPRO_VERSION_CHECK(5,11,0) || \
+    JSON_HEDLEY_ARM_VERSION_CHECK(4,1,0) || \
+    JSON_HEDLEY_IBM_VERSION_CHECK(10,1,0) || \
+    JSON_HEDLEY_TI_VERSION_CHECK(15,12,0) || \
+    (JSON_HEDLEY_TI_ARMCL_VERSION_CHECK(4,8,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_ARMCL_VERSION_CHECK(5,2,0) || \
+    (JSON_HEDLEY_TI_CL2000_VERSION_CHECK(6,0,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_CL2000_VERSION_CHECK(6,4,0) || \
+    (JSON_HEDLEY_TI_CL430_VERSION_CHECK(4,0,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_CL430_VERSION_CHECK(4,3,0) || \
+    (JSON_HEDLEY_TI_CL6X_VERSION_CHECK(7,2,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+    JSON_HEDLEY_TI_CL6X_VERSION_CHECK(7,5,0) || \
+    JSON_HEDLEY_TI_CL7X_VERSION_CHECK(1,2,0) || \
+    JSON_HEDLEY_TI_CLPRU_VERSION_CHECK(2,1,0) || \
+    JSON_HEDLEY_MCST_LCC_VERSION_CHECK(1,25,10) || \
+    JSON_HEDLEY_IAR_VERSION_CHECK(8,10,0)
+    #define JSON_HEDLEY_NEVER_INLINE __attribute__((__noinline__))
+#elif \
+    JSON_HEDLEY_MSVC_VERSION_CHECK(13,10,0) || \
+    JSON_HEDLEY_INTEL_CL_VERSION_CHECK(2021,1,0)
+    #define JSON_HEDLEY_NEVER_INLINE __declspec(noinline)
+#elif JSON_HEDLEY_PGI_VERSION_CHECK(10,2,0)
+    #define JSON_HEDLEY_NEVER_INLINE _Pragma("noinline")
+#elif JSON_HEDLEY_TI_CL6X_VERSION_CHECK(6,0,0) && defined(__cplusplus)
+    #define JSON_HEDLEY_NEVER_INLINE _Pragma("FUNC_CANNOT_INLINE;")
+#elif JSON_HEDLEY_IAR_VERSION_CHECK(8,0,0)
+    #define JSON_HEDLEY_NEVER_INLINE _Pragma("inline=never")
+#elif JSON_HEDLEY_COMPCERT_VERSION_CHECK(3,2,0)
+    #define JSON_HEDLEY_NEVER_INLINE __attribute((noinline))
+#elif JSON_HEDLEY_PELLES_VERSION_CHECK(9,0,0)
+    #define JSON_HEDLEY_NEVER_INLINE __declspec(noinline)
+#else
+    #define JSON_HEDLEY_NEVER_INLINE
+#endif
+
+#if defined(JSON_HEDLEY_PRIVATE)
+    #undef JSON_HEDLEY_PRIVATE
+#endif
+#if defined(JSON_HEDLEY_PUBLIC)
+    #undef JSON_HEDLEY_PUBLIC
+#endif
+#if defined(JSON_HEDLEY_IMPORT)
+    #undef JSON_HEDLEY_IMPORT
+#endif
+#if defined(_WIN32) || defined(__CYGWIN__)
+#  define JSON_HEDLEY_PRIVATE
+#  define JSON_HEDLEY_PUBLIC   __declspec(dllexport)
+#  define JSON_HEDLEY_IMPORT   __declspec(dllimport)
+#else
+#  if \
+    JSON_HEDLEY_HAS_ATTRIBUTE(visibility) || \
+    JSON_HEDLEY_GCC_VERSION_CHECK(3,3,0) || \
+    JSON_HEDLEY_SUNPRO_VERSION_CHECK(5,11,0) || \
+    JSON_HEDLEY_INTEL_VERSION_CHECK(13,0,0) || \
+    JSON_HEDLEY_ARM_VERSION_CHECK(4,1,0) || \
+    JSON_HEDLEY_IBM_VERSION_CHECK(13,1,0) || \
+    ( \
+      defined(__TI_EABI__) && \
+      ( \
+        (JSON_HEDLEY_TI_CL6X_VERSION_CHECK(7,2,0) && defined(__TI_GNU_ATTRIBUTE_SUPPORT__)) || \
+        JSON_HEDLEY_TI_CL6X_VERSION_CHECK(7,5,0) \
+      ) \
+    ) || \
+    JSON_HEDLEY_MCST_LCC_VERSION_CHECK(1,25,10)
+#    define JSON_HEDLEY_PRIVATE __attribute__((__visibility__("hidden")))
+#    define JSON_HEDLEY_PUBLIC  __attribute__((__visibility__("default")))
+#  else
+#    define JSON_HEDLEY_PRIVATE
+#    define JSON_HEDLEY_PUBLIC
+#  endif
+#  define JSON_HEDLEY_IMPORT    extern
+#endif
+
+#if defined(JSON_HEDLEY_NO_THROW)
+    #undef JSON_HEDLEY_NO_THROW
+#endif
+#if \
+    JSON_HEDLEY_HAS_ATTRIBUTE(nothrow) || \
+    JSON_HEDLEY_GCC_VERSION_CHECK(3,3,0) || \
+    JSON_HEDLEY_INTEL_VERSION_CHECK(13,0,0) || \
+    JSON_HEDLEY_MCST_LCC_VERSION_CHECK(1,25,10)
+    #define JSON_HEDLEY_NO_THROW __attribute__((__nothrow__))
+#elif \
+    JSON_HEDLEY_MSVC_VERSION_CHECK(13,1,0) || \
+    JSON_HEDLEY_INTEL_CL_VERSION_CHECK(2021,1,0) || \
+    JSON_HEDLEY_ARM_VERSION_CHECK(4,1,0)
+    #define JSON_HEDLEY_NO_THROW __declspec(nothrow)
+#else
+    #define JSON_HEDLEY_NO_THROW
+#endif
+
+#if defined(JSON_HEDLEY_FALL_THROUGH)
+    #undef JSON_HEDLEY_FALL_THROUGH
+#endif
+#if \
+    JSON_HEDLEY_HAS_ATTRIBUTE(fallthrough) || \
+    JSON_HEDLEY_GCC_VERSION_CHECK(7,0,0) || \
+    JSON_HEDLEY_MCST_LCC_VERSION_CHECK(1,25,10)
+    #define JSON_HEDLEY_FALL_THROUGH __attribute__((__fallthrough__))
+#elif JSON_HEDLEY_HAS_CPP_ATTRIBUTE_NS(clang,fallthrough)
+    #define JSON_HEDLEY_FALL_THROUGH JSON_HEDLEY_DIAGNOSTIC_DISABLE_CPP98_COMPAT_WRAP_([[clang::fallthrough]])
+#elif JSON_HEDLEY_HAS_CPP_ATTRIBUTE(fallthrough)
+    #define JSON_HEDLEY_FALL_THROUGH JSON_HEDLEY_DIAGNOSTIC_DISABLE_CPP98_COMPAT_WRAP_([[fallthrough]])
+#elif defined(__fallthrough) /* SAL */
+    #define JSON_HEDLEY_FALL_THROUGH __fallthrough
+#else
+    #define JSON_HEDLEY_FALL_THROUGH
+#endif
+
+#if defined(JSON_HEDLEY_RETURNS_NON_NULL)
+    #undef JSON_HEDLEY_RETURNS_NON_NULL
+#endif
+#if \
+    JSON_HEDLEY_HAS_ATTRIBUTE(returns_nonnull) || \
+    JSON_HEDLEY_GCC_VERSION_CHECK(4,9,0) || \
+    JSON_HEDLEY_MCST_LCC_VERSION_CHECK(1,25,10)
+    #define JSON_HEDLEY_RETURNS_NON_NULL __attribute__((__returns_nonnull__))
+#elif defined(_Ret_notnull_) /* SAL */
+    #define JSON_HEDLEY_RETURNS_NON_NULL _Ret_notnull_
+#else
+    #define JSON_HEDLEY_RETURNS_NON_NULL
+#endif
+
+#if defined(JSON_HEDLEY_ARRAY_PARAM)
+    #undef JSON_HEDLEY_ARRAY_PARAM
+#endif
+#if \
+    defined(__STDC_VERSION__) && (__STDC_VERSION__ >= 199901L) && \
+    !defined(__STDC_NO_VLA__) && \
+    !defined(__cplusplus) && \
+    !defined(JSON_HEDLEY_PGI_VERSION) && \
+    !defined(JSON_HEDLEY_TINYC_VERSION)
+    #define JSON_HEDLEY_ARRAY_PARAM(name) (name)
+#else
+    #define JSON_HEDLEY_ARRAY_PARAM(name)
+#endif
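+
+/* Illustrative usage sketch (not part of Hedley): JSON_HEDLEY_ARRAY_PARAM
+   documents a minimum array length in a prototype where C99 VLAs exist
+   and degrades to an unsized parameter elsewhere, e.g.
+
+       void fill(size_t n, int values[JSON_HEDLEY_ARRAY_PARAM(n)]);
+
+   declares int values[(n)] under C99 and int values[] otherwise. */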
+
+#if defined(JSON_HEDLEY_IS_CONSTANT)
+    #undef JSON_HEDLEY_IS_CONSTANT
+#endif
+#if defined(JSON_HEDLEY_REQUIRE_CONSTEXPR)
+    #undef JSON_HEDLEY_REQUIRE_CONSTEXPR
+#endif
+/* JSON_HEDLEY_IS_CONSTEXPR_ is for
+   HEDLEY INTERNAL USE ONLY.  API subject to change without notice. */
+#if defined(JSON_HEDLEY_IS_CONSTEXPR_)
+    #undef JSON_HEDLEY_IS_CONSTEXPR_
+#endif
+#if \
+    JSON_HEDLEY_HAS_BUILTIN(__builtin_constant_p) || \
+    JSON_HEDLEY_GCC_VERSION_CHECK(3,4,0) || \
+    JSON_HEDLEY_INTEL_VERSION_CHECK(13,0,0) || \
+    JSON_HEDLEY_TINYC_VERSION_CHECK(0,9,19) || \
+    JSON_HEDLEY_ARM_VERSION_CHECK(4,1,0) || \
+    JSON_HEDLEY_IBM_VERSION_CHECK(13,1,0) || \
+    JSON_HEDLEY_TI_CL6X_VERSION_CHECK(6,1,0) || \
+    (JSON_HEDLEY_SUNPRO_VERSION_CHECK(5,10,0) && !defined(__cplusplus)) || \
+    JSON_HEDLEY_CRAY_VERSION_CHECK(8,1,0) || \
+    JSON_HEDLEY_MCST_LCC_VERSION_CHECK(1,25,10)
+    #define JSON_HEDLEY_IS_CONSTANT(expr) __builtin_constant_p(expr)
+#endif
+#if !defined(__cplusplus)
+#  if \
+       JSON_HEDLEY_HAS_BUILTIN(__builtin_types_compatible_p) || \
+       JSON_HEDLEY_GCC_VERSION_CHECK(3,4,0) || \
+       JSON_HEDLEY_INTEL_VERSION_CHECK(13,0,0) || \
+       JSON_HEDLEY_IBM_VERSION_CHECK(13,1,0) || \
+       JSON_HEDLEY_CRAY_VERSION_CHECK(8,1,0) || \
+       JSON_HEDLEY_ARM_VERSION_CHECK(5,4,0) || \
+       JSON_HEDLEY_TINYC_VERSION_CHECK(0,9,24)
+#if defined(__INTPTR_TYPE__)
+    #define JSON_HEDLEY_IS_CONSTEXPR_(expr) __builtin_types_compatible_p(__typeof__((1 ? (void*) ((__INTPTR_TYPE__) ((expr) * 0)) : (int*) 0)), int*)
+#else
+    #include <stdint.h>
+    #define JSON_HEDLEY_IS_CONSTEXPR_(expr) __builtin_types_compatible_p(__typeof__((1 ? (void*) ((intptr_t) ((expr) * 0)) : (int*) 0)), int*)
+#endif
+#  elif \
+       ( \
+          defined(__STDC_VERSION__) && (__STDC_VERSION__ >= 201112L) && \
+          !defined(JSON_HEDLEY_SUNPRO_VERSION) && \
+          !defined(JSON_HEDLEY_PGI_VERSION) && \
+          !defined(JSON_HEDLEY_IAR_VERSION)) || \
+       (JSON_HEDLEY_HAS_EXTENSION(c_generic_selections) && !defined(JSON_HEDLEY_IAR_VERSION)) || \
+       JSON_HEDLEY_GCC_VERSION_CHECK(4,9,0) || \
+       JSON_HEDLEY_INTEL_VERSION_CHECK(17,0,0) || \
+       JSON_HEDLEY_IBM_VERSION_CHECK(12,1,0) || \
+       JSON_HEDLEY_ARM_VERSION_CHECK(5,3,0)
+#if defined(__INTPTR_TYPE__)
+    #define JSON_HEDLEY_IS_CONSTEXPR_(expr) _Generic((1 ? (void*) ((__INTPTR_TYPE__) ((expr) * 0)) : (int*) 0), int*: 1, void*: 0)
+#else
+    #include <stdint.h>
+    #define JSON_HEDLEY_IS_CONSTEXPR_(expr) _Generic((1 ? (void*) ((intptr_t) ((expr) * 0)) : (int*) 0), int*: 1, void*: 0)
+#endif
+#  elif \
+       defined(JSON_HEDLEY_GCC_VERSION) || \
+       defined(JSON_HEDLEY_INTEL_VERSION) || \
+       defined(JSON_HEDLEY_TINYC_VERSION) || \
+       defined(JSON_HEDLEY_TI_ARMCL_VERSION) || \
+       JSON_HEDLEY_TI_CL430_VERSION_CHECK(18,12,0) || \
+       defined(JSON_HEDLEY_TI_CL2000_VERSION) || \
+       defined(JSON_HEDLEY_TI_CL6X_VERSION) || \
+       defined(JSON_HEDLEY_TI_CL7X_VERSION) || \
+       defined(JSON_HEDLEY_TI_CLPRU_VERSION) || \
+       defined(__clang__)
+#    define JSON_HEDLEY_IS_CONSTEXPR_(expr) ( \
+        sizeof(void) != \
+        sizeof(*( \
+                  1 ? \
+                  ((void*) ((expr) * 0L) ) : \
+                  ((struct { char v[sizeof(void) * 2]; } *) 1) \
+                ) \
+              ) \
+                                            )
+#  endif
+#endif
+#if defined(JSON_HEDLEY_IS_CONSTEXPR_)
+    #if !defined(JSON_HEDLEY_IS_CONSTANT)
+        #define JSON_HEDLEY_IS_CONSTANT(expr) JSON_HEDLEY_IS_CONSTEXPR_(expr)
+    #endif
+    #define JSON_HEDLEY_REQUIRE_CONSTEXPR(expr) (JSON_HEDLEY_IS_CONSTEXPR_(expr) ? (expr) : (-1))
+#else
+    #if !defined(JSON_HEDLEY_IS_CONSTANT)
+        #define JSON_HEDLEY_IS_CONSTANT(expr) (0)
+    #endif
+    #define JSON_HEDLEY_REQUIRE_CONSTEXPR(expr) (expr)
+#endif
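+
+/* Illustrative usage sketch (not part of Hedley): JSON_HEDLEY_IS_CONSTANT(8)
+   evaluates to 1 where the compiler can tell that the argument is a
+   compile-time constant, while JSON_HEDLEY_IS_CONSTANT(argc) evaluates to 0
+   (and the macro is always 0 on compilers with no way to check).
+   JSON_HEDLEY_REQUIRE_CONSTEXPR goes one step further: when the argument is
+   not a constant expression, neither is the resulting ternary, so using the
+   result as e.g. an array size or case label triggers a diagnostic. */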
+
+#if defined(JSON_HEDLEY_BEGIN_C_DECLS)
+    #undef JSON_HEDLEY_BEGIN_C_DECLS
+#endif
+#if defined(JSON_HEDLEY_END_C_DECLS)
+    #undef JSON_HEDLEY_END_C_DECLS
+#endif
+#if defined(JSON_HEDLEY_C_DECL)
+    #undef JSON_HEDLEY_C_DECL
+#endif
+#if defined(__cplusplus)
+    #define JSON_HEDLEY_BEGIN_C_DECLS extern "C" {
+    #define JSON_HEDLEY_END_C_DECLS }
+    #define JSON_HEDLEY_C_DECL extern "C"
+#else
+    #define JSON_HEDLEY_BEGIN_C_DECLS
+    #define JSON_HEDLEY_END_C_DECLS
+    #define JSON_HEDLEY_C_DECL
+#endif
+
+#if defined(JSON_HEDLEY_STATIC_ASSERT)
+    #undef JSON_HEDLEY_STATIC_ASSERT
+#endif
+#if \
+  !defined(__cplusplus) && ( \
+      (defined(__STDC_VERSION__) && (__STDC_VERSION__ >= 201112L)) || \
+      (JSON_HEDLEY_HAS_FEATURE(c_static_assert) && !defined(JSON_HEDLEY_INTEL_CL_VERSION)) || \
+      JSON_HEDLEY_GCC_VERSION_CHECK(6,0,0) || \
+      JSON_HEDLEY_INTEL_VERSION_CHECK(13,0,0) || \
+      defined(_Static_assert) \
+    )
+#  define JSON_HEDLEY_STATIC_ASSERT(expr, message) _Static_assert(expr, message)
+#elif \
+  (defined(__cplusplus) && (__cplusplus >= 201103L)) || \
+  JSON_HEDLEY_MSVC_VERSION_CHECK(16,0,0) || \
+  JSON_HEDLEY_INTEL_CL_VERSION_CHECK(2021,1,0)
+#  define JSON_HEDLEY_STATIC_ASSERT(expr, message) JSON_HEDLEY_DIAGNOSTIC_DISABLE_CPP98_COMPAT_WRAP_(static_assert(expr, message))
+#else
+#  define JSON_HEDLEY_STATIC_ASSERT(expr, message)
+#endif
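+
+/* Illustrative: JSON_HEDLEY_STATIC_ASSERT(sizeof(int) >= 4, "int too small")
+   maps onto C11 _Static_assert or C++11 static_assert where available and
+   expands to nothing on older compilers, so it cannot serve as the only
+   guard for a correctness-critical assumption. */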
+
+#if defined(JSON_HEDLEY_NULL)
+    #undef JSON_HEDLEY_NULL
+#endif
+#if defined(__cplusplus)
+    #if __cplusplus >= 201103L
+        #define JSON_HEDLEY_NULL JSON_HEDLEY_DIAGNOSTIC_DISABLE_CPP98_COMPAT_WRAP_(nullptr)
+    #elif defined(NULL)
+        #define JSON_HEDLEY_NULL NULL
+    #else
+        #define JSON_HEDLEY_NULL JSON_HEDLEY_STATIC_CAST(void*, 0)
+    #endif
+#elif defined(NULL)
+    #define JSON_HEDLEY_NULL NULL
+#else
+    #define JSON_HEDLEY_NULL ((void*) 0)
+#endif
+
+#if defined(JSON_HEDLEY_MESSAGE)
+    #undef JSON_HEDLEY_MESSAGE
+#endif
+#if JSON_HEDLEY_HAS_WARNING("-Wunknown-pragmas")
+#  define JSON_HEDLEY_MESSAGE(msg) \
+    JSON_HEDLEY_DIAGNOSTIC_PUSH \
+    JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_PRAGMAS \
+    JSON_HEDLEY_PRAGMA(message msg) \
+    JSON_HEDLEY_DIAGNOSTIC_POP
+#elif \
+  JSON_HEDLEY_GCC_VERSION_CHECK(4,4,0) || \
+  JSON_HEDLEY_INTEL_VERSION_CHECK(13,0,0)
+#  define JSON_HEDLEY_MESSAGE(msg) JSON_HEDLEY_PRAGMA(message msg)
+#elif JSON_HEDLEY_CRAY_VERSION_CHECK(5,0,0)
+#  define JSON_HEDLEY_MESSAGE(msg) JSON_HEDLEY_PRAGMA(_CRI message msg)
+#elif JSON_HEDLEY_IAR_VERSION_CHECK(8,0,0)
+#  define JSON_HEDLEY_MESSAGE(msg) JSON_HEDLEY_PRAGMA(message(msg))
+#elif JSON_HEDLEY_PELLES_VERSION_CHECK(2,0,0)
+#  define JSON_HEDLEY_MESSAGE(msg) JSON_HEDLEY_PRAGMA(message(msg))
+#else
+#  define JSON_HEDLEY_MESSAGE(msg)
+#endif
+
+#if defined(JSON_HEDLEY_WARNING)
+    #undef JSON_HEDLEY_WARNING
+#endif
+#if JSON_HEDLEY_HAS_WARNING("-Wunknown-pragmas")
+#  define JSON_HEDLEY_WARNING(msg) \
+    JSON_HEDLEY_DIAGNOSTIC_PUSH \
+    JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_PRAGMAS \
+    JSON_HEDLEY_PRAGMA(clang warning msg) \
+    JSON_HEDLEY_DIAGNOSTIC_POP
+#elif \
+  JSON_HEDLEY_GCC_VERSION_CHECK(4,8,0) || \
+  JSON_HEDLEY_PGI_VERSION_CHECK(18,4,0) || \
+  JSON_HEDLEY_INTEL_VERSION_CHECK(13,0,0)
+#  define JSON_HEDLEY_WARNING(msg) JSON_HEDLEY_PRAGMA(GCC warning msg)
+#elif \
+  JSON_HEDLEY_MSVC_VERSION_CHECK(15,0,0) || \
+  JSON_HEDLEY_INTEL_CL_VERSION_CHECK(2021,1,0)
+#  define JSON_HEDLEY_WARNING(msg) JSON_HEDLEY_PRAGMA(message(msg))
+#else
+#  define JSON_HEDLEY_WARNING(msg) JSON_HEDLEY_MESSAGE(msg)
+#endif
+
+#if defined(JSON_HEDLEY_REQUIRE)
+    #undef JSON_HEDLEY_REQUIRE
+#endif
+#if defined(JSON_HEDLEY_REQUIRE_MSG)
+    #undef JSON_HEDLEY_REQUIRE_MSG
+#endif
+#if JSON_HEDLEY_HAS_ATTRIBUTE(diagnose_if)
+#  if JSON_HEDLEY_HAS_WARNING("-Wgcc-compat")
+#    define JSON_HEDLEY_REQUIRE(expr) \
+    JSON_HEDLEY_DIAGNOSTIC_PUSH \
+    _Pragma("clang diagnostic ignored \"-Wgcc-compat\"") \
+    __attribute__((diagnose_if(!(expr), #expr, "error"))) \
+    JSON_HEDLEY_DIAGNOSTIC_POP
+#    define JSON_HEDLEY_REQUIRE_MSG(expr,msg) \
+    JSON_HEDLEY_DIAGNOSTIC_PUSH \
+    _Pragma("clang diagnostic ignored \"-Wgcc-compat\"") \
+    __attribute__((diagnose_if(!(expr), msg, "error"))) \
+    JSON_HEDLEY_DIAGNOSTIC_POP
+#  else
+#    define JSON_HEDLEY_REQUIRE(expr) __attribute__((diagnose_if(!(expr), #expr, "error")))
+#    define JSON_HEDLEY_REQUIRE_MSG(expr,msg) __attribute__((diagnose_if(!(expr), msg, "error")))
+#  endif
+#else
+#  define JSON_HEDLEY_REQUIRE(expr)
+#  define JSON_HEDLEY_REQUIRE_MSG(expr,msg)
+#endif
+
+#if defined(JSON_HEDLEY_FLAGS)
+    #undef JSON_HEDLEY_FLAGS
+#endif
+#if JSON_HEDLEY_HAS_ATTRIBUTE(flag_enum) && (!defined(__cplusplus) || JSON_HEDLEY_HAS_WARNING("-Wbitfield-enum-conversion"))
+    #define JSON_HEDLEY_FLAGS __attribute__((__flag_enum__))
+#else
+    #define JSON_HEDLEY_FLAGS
+#endif
+
+#if defined(JSON_HEDLEY_FLAGS_CAST)
+    #undef JSON_HEDLEY_FLAGS_CAST
+#endif
+#if JSON_HEDLEY_INTEL_VERSION_CHECK(19,0,0)
+#  define JSON_HEDLEY_FLAGS_CAST(T, expr) (__extension__ ({ \
+        JSON_HEDLEY_DIAGNOSTIC_PUSH \
+        _Pragma("warning(disable:188)") \
+        ((T) (expr)); \
+        JSON_HEDLEY_DIAGNOSTIC_POP \
+    }))
+#else
+#  define JSON_HEDLEY_FLAGS_CAST(T, expr) JSON_HEDLEY_STATIC_CAST(T, expr)
+#endif
+
+#if defined(JSON_HEDLEY_EMPTY_BASES)
+    #undef JSON_HEDLEY_EMPTY_BASES
+#endif
+#if \
+    (JSON_HEDLEY_MSVC_VERSION_CHECK(19,0,23918) && !JSON_HEDLEY_MSVC_VERSION_CHECK(20,0,0)) || \
+    JSON_HEDLEY_INTEL_CL_VERSION_CHECK(2021,1,0)
+    #define JSON_HEDLEY_EMPTY_BASES __declspec(empty_bases)
+#else
+    #define JSON_HEDLEY_EMPTY_BASES
+#endif
+
+/* Remaining macros are deprecated. */
+
+#if defined(JSON_HEDLEY_GCC_NOT_CLANG_VERSION_CHECK)
+    #undef JSON_HEDLEY_GCC_NOT_CLANG_VERSION_CHECK
+#endif
+#if defined(__clang__)
+    #define JSON_HEDLEY_GCC_NOT_CLANG_VERSION_CHECK(major,minor,patch) (0)
+#else
+    #define JSON_HEDLEY_GCC_NOT_CLANG_VERSION_CHECK(major,minor,patch) JSON_HEDLEY_GCC_VERSION_CHECK(major,minor,patch)
+#endif
+
+#if defined(JSON_HEDLEY_CLANG_HAS_ATTRIBUTE)
+    #undef JSON_HEDLEY_CLANG_HAS_ATTRIBUTE
+#endif
+#define JSON_HEDLEY_CLANG_HAS_ATTRIBUTE(attribute) JSON_HEDLEY_HAS_ATTRIBUTE(attribute)
+
+#if defined(JSON_HEDLEY_CLANG_HAS_CPP_ATTRIBUTE)
+    #undef JSON_HEDLEY_CLANG_HAS_CPP_ATTRIBUTE
+#endif
+#define JSON_HEDLEY_CLANG_HAS_CPP_ATTRIBUTE(attribute) JSON_HEDLEY_HAS_CPP_ATTRIBUTE(attribute)
+
+#if defined(JSON_HEDLEY_CLANG_HAS_BUILTIN)
+    #undef JSON_HEDLEY_CLANG_HAS_BUILTIN
+#endif
+#define JSON_HEDLEY_CLANG_HAS_BUILTIN(builtin) JSON_HEDLEY_HAS_BUILTIN(builtin)
+
+#if defined(JSON_HEDLEY_CLANG_HAS_FEATURE)
+    #undef JSON_HEDLEY_CLANG_HAS_FEATURE
+#endif
+#define JSON_HEDLEY_CLANG_HAS_FEATURE(feature) JSON_HEDLEY_HAS_FEATURE(feature)
+
+#if defined(JSON_HEDLEY_CLANG_HAS_EXTENSION)
+    #undef JSON_HEDLEY_CLANG_HAS_EXTENSION
+#endif
+#define JSON_HEDLEY_CLANG_HAS_EXTENSION(extension) JSON_HEDLEY_HAS_EXTENSION(extension)
+
+#if defined(JSON_HEDLEY_CLANG_HAS_DECLSPEC_ATTRIBUTE)
+    #undef JSON_HEDLEY_CLANG_HAS_DECLSPEC_ATTRIBUTE
+#endif
+#define JSON_HEDLEY_CLANG_HAS_DECLSPEC_ATTRIBUTE(attribute) JSON_HEDLEY_HAS_DECLSPEC_ATTRIBUTE(attribute)
+
+#if defined(JSON_HEDLEY_CLANG_HAS_WARNING)
+    #undef JSON_HEDLEY_CLANG_HAS_WARNING
+#endif
+#define JSON_HEDLEY_CLANG_HAS_WARNING(warning) JSON_HEDLEY_HAS_WARNING(warning)
+
+#endif /* !defined(JSON_HEDLEY_VERSION) || (JSON_HEDLEY_VERSION < X) */
+
+// #include <nlohmann/detail/meta/detected.hpp>
+
+
+#include <type_traits>
+
+// #include <nlohmann/detail/meta/void_t.hpp>
+
+
+namespace nlohmann
+{
+namespace detail
+{
+template<typename ...Ts> struct make_void
+{
+    using type = void;
+};
+template<typename ...Ts> using void_t = typename make_void<Ts...>::type;
+} // namespace detail
+}  // namespace nlohmann
+
+
+// https://en.cppreference.com/w/cpp/experimental/is_detected
+namespace nlohmann
+{
+namespace detail
+{
+struct nonesuch
+{
+    nonesuch() = delete;
+    ~nonesuch() = delete;
+    nonesuch(nonesuch const&) = delete;
+    nonesuch(nonesuch const&&) = delete;
+    void operator=(nonesuch const&) = delete;
+    void operator=(nonesuch&&) = delete;
+};
+
+template<class Default,
+         class AlwaysVoid,
+         template<class...> class Op,
+         class... Args>
+struct detector
+{
+    using value_t = std::false_type;
+    using type = Default;
+};
+
+template<class Default, template<class...> class Op, class... Args>
+struct detector<Default, void_t<Op<Args...>>, Op, Args...>
+{
+    using value_t = std::true_type;
+    using type = Op<Args...>;
+};
+
+template<template<class...> class Op, class... Args>
+using is_detected = typename detector<nonesuch, void, Op, Args...>::value_t;
+
+template<template<class...> class Op, class... Args>
+struct is_detected_lazy : is_detected<Op, Args...> { };
+
+template<template<class...> class Op, class... Args>
+using detected_t = typename detector<nonesuch, void, Op, Args...>::type;
+
+template<class Default, template<class...> class Op, class... Args>
+using detected_or = detector<Default, void, Op, Args...>;
+
+template<class Default, template<class...> class Op, class... Args>
+using detected_or_t = typename detected_or<Default, Op, Args...>::type;
+
+template<class Expected, template<class...> class Op, class... Args>
+using is_detected_exact = std::is_same<Expected, detected_t<Op, Args...>>;
+
+template<class To, template<class...> class Op, class... Args>
+using is_detected_convertible =
+    std::is_convertible<detected_t<Op, Args...>, To>;
+}  // namespace detail
+}  // namespace nlohmann
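+
+// Illustrative usage sketch (not part of the library): the detection idiom
+// above answers "does T provide X?" as a compile-time constant, e.g.
+//
+//   template<class T> using value_type_t = typename T::value_type;
+//   static_assert(nlohmann::detail::is_detected<value_type_t,
+//                                               std::vector<int>>::value, "");
+//   static_assert(!nlohmann::detail::is_detected<value_type_t, int>::value, "");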
+
+
+// This file contains all internal macro definitions
+// You MUST include macro_unscope.hpp at the end of json.hpp to undef all of them
+
+// exclude unsupported compilers
+#if !defined(JSON_SKIP_UNSUPPORTED_COMPILER_CHECK)
+    #if defined(__clang__)
+        #if (__clang_major__ * 10000 + __clang_minor__ * 100 + __clang_patchlevel__) < 30400
+            #error "unsupported Clang version - see https://github.com/nlohmann/json#supported-compilers"
+        #endif
+    #elif defined(__GNUC__) && !(defined(__ICC) || defined(__INTEL_COMPILER))
+        #if (__GNUC__ * 10000 + __GNUC_MINOR__ * 100 + __GNUC_PATCHLEVEL__) < 40800
+            #error "unsupported GCC version - see https://github.com/nlohmann/json#supported-compilers"
+        #endif
+    #endif
+#endif
+
+// C++ language standard detection
+// if the user manually specified the C++ version to use, this detection is skipped
+#if !defined(JSON_HAS_CPP_20) && !defined(JSON_HAS_CPP_17) && !defined(JSON_HAS_CPP_14) && !defined(JSON_HAS_CPP_11)
+    #if (defined(__cplusplus) && __cplusplus >= 202002L) || (defined(_MSVC_LANG) && _MSVC_LANG >= 202002L)
+        #define JSON_HAS_CPP_20
+        #define JSON_HAS_CPP_17
+        #define JSON_HAS_CPP_14
+    #elif (defined(__cplusplus) && __cplusplus >= 201703L) || (defined(_HAS_CXX17) && _HAS_CXX17 == 1) // fix for issue #464
+        #define JSON_HAS_CPP_17
+        #define JSON_HAS_CPP_14
+    #elif (defined(__cplusplus) && __cplusplus >= 201402L) || (defined(_HAS_CXX14) && _HAS_CXX14 == 1)
+        #define JSON_HAS_CPP_14
+    #endif
+    // the C++11 flag is always set because C++11 is the minimum required version
+    #define JSON_HAS_CPP_11
+#endif
+
+#if !defined(JSON_HAS_FILESYSTEM) && !defined(JSON_HAS_EXPERIMENTAL_FILESYSTEM)
+    #ifdef JSON_HAS_CPP_17
+        #if defined(__cpp_lib_filesystem)
+            #define JSON_HAS_FILESYSTEM 1
+        #elif defined(__cpp_lib_experimental_filesystem)
+            #define JSON_HAS_EXPERIMENTAL_FILESYSTEM 1
+        #elif !defined(__has_include)
+            #define JSON_HAS_EXPERIMENTAL_FILESYSTEM 1
+        #elif __has_include(<filesystem>)
+            #define JSON_HAS_FILESYSTEM 1
+        #elif __has_include(<experimental/filesystem>)
+            #define JSON_HAS_EXPERIMENTAL_FILESYSTEM 1
+        #endif
+
+        // std::filesystem does not work on MinGW GCC 8: https://sourceforge.net/p/mingw-w64/bugs/737/
+        #if defined(__MINGW32__) && defined(__GNUC__) && __GNUC__ == 8
+            #undef JSON_HAS_FILESYSTEM
+            #undef JSON_HAS_EXPERIMENTAL_FILESYSTEM
+        #endif
+
+        // no filesystem support before GCC 8: https://en.cppreference.com/w/cpp/compiler_support
+        #if defined(__GNUC__) && !defined(__clang__) && __GNUC__ < 8
+            #undef JSON_HAS_FILESYSTEM
+            #undef JSON_HAS_EXPERIMENTAL_FILESYSTEM
+        #endif
+
+        // no filesystem support before Clang 7: https://en.cppreference.com/w/cpp/compiler_support
+        #if defined(__clang_major__) && __clang_major__ < 7
+            #undef JSON_HAS_FILESYSTEM
+            #undef JSON_HAS_EXPERIMENTAL_FILESYSTEM
+        #endif
+
+        // no filesystem support before MSVC 19.14: https://en.cppreference.com/w/cpp/compiler_support
+        #if defined(_MSC_VER) && _MSC_VER < 1914
+            #undef JSON_HAS_FILESYSTEM
+            #undef JSON_HAS_EXPERIMENTAL_FILESYSTEM
+        #endif
+
+        // no filesystem support before iOS 13
+        #if defined(__IPHONE_OS_VERSION_MIN_REQUIRED) && __IPHONE_OS_VERSION_MIN_REQUIRED < 130000
+            #undef JSON_HAS_FILESYSTEM
+            #undef JSON_HAS_EXPERIMENTAL_FILESYSTEM
+        #endif
+
+        // no filesystem support before macOS Catalina
+        #if defined(__MAC_OS_X_VERSION_MIN_REQUIRED) && __MAC_OS_X_VERSION_MIN_REQUIRED < 101500
+            #undef JSON_HAS_FILESYSTEM
+            #undef JSON_HAS_EXPERIMENTAL_FILESYSTEM
+        #endif
+    #endif
+#endif
+
+#ifndef JSON_HAS_EXPERIMENTAL_FILESYSTEM
+    #define JSON_HAS_EXPERIMENTAL_FILESYSTEM 0
+#endif
+
+#ifndef JSON_HAS_FILESYSTEM
+    #define JSON_HAS_FILESYSTEM 0
+#endif
+
+// disable documentation warnings on clang
+#if defined(__clang__)
+    #pragma clang diagnostic push
+    #pragma clang diagnostic ignored "-Wdocumentation"
+    #pragma clang diagnostic ignored "-Wdocumentation-unknown-command"
+#endif
+
+// allow disabling exceptions
+#if (defined(__cpp_exceptions) || defined(__EXCEPTIONS) || defined(_CPPUNWIND)) && !defined(JSON_NOEXCEPTION)
+    #define JSON_THROW(exception) throw exception
+    #define JSON_TRY try
+    #define JSON_CATCH(exception) catch(exception)
+    #define JSON_INTERNAL_CATCH(exception) catch(exception)
+#else
+    #include <cstdlib>
+    #define JSON_THROW(exception) std::abort()
+    #define JSON_TRY if(true)
+    #define JSON_CATCH(exception) if(false)
+    #define JSON_INTERNAL_CATCH(exception) if(false)
+#endif
+
+// override exception macros
+#if defined(JSON_THROW_USER)
+    #undef JSON_THROW
+    #define JSON_THROW JSON_THROW_USER
+#endif
+#if defined(JSON_TRY_USER)
+    #undef JSON_TRY
+    #define JSON_TRY JSON_TRY_USER
+#endif
+#if defined(JSON_CATCH_USER)
+    #undef JSON_CATCH
+    #define JSON_CATCH JSON_CATCH_USER
+    #undef JSON_INTERNAL_CATCH
+    #define JSON_INTERNAL_CATCH JSON_CATCH_USER
+#endif
+#if defined(JSON_INTERNAL_CATCH_USER)
+    #undef JSON_INTERNAL_CATCH
+    #define JSON_INTERNAL_CATCH JSON_INTERNAL_CATCH_USER
+#endif
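+
+// Illustrative: building with -DJSON_NOEXCEPTION (or with exceptions disabled,
+// which leaves __cpp_exceptions et al. undefined) replaces JSON_THROW with
+// std::abort() and turns JSON_TRY/JSON_CATCH into plain if(true)/if(false)
+// blocks. The *_USER macros above let a project substitute its own handling,
+// e.g. defining, before including json.hpp,
+//
+//   #define JSON_THROW_USER(exception) log_and_abort((exception).what())
+//
+// where log_and_abort is a hypothetical project-specific hook.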
+
+// allow overriding assert
+#if !defined(JSON_ASSERT)
+    #include <cassert> // assert
+    #define JSON_ASSERT(x) assert(x)
+#endif
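+
+// Illustrative: the assertion mechanism can likewise be swapped by defining
+// JSON_ASSERT before including json.hpp, e.g.
+//
+//   #define JSON_ASSERT(x) my_assert_handler(x)  // hypothetical handler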
+
+// allow access to some private functions (needed by the test suite)
+#if defined(JSON_TESTS_PRIVATE)
+    #define JSON_PRIVATE_UNLESS_TESTED public
+#else
+    #define JSON_PRIVATE_UNLESS_TESTED private
+#endif
+
+/*!
+@brief macro to concisely define a mapping between an enum and JSON
+@def NLOHMANN_JSON_SERIALIZE_ENUM
+@since version 3.4.0
+*/
+#define NLOHMANN_JSON_SERIALIZE_ENUM(ENUM_TYPE, ...)                                            \
+    template<typename BasicJsonType>                                                            \
+    inline void to_json(BasicJsonType& j, const ENUM_TYPE& e)                                   \
+    {                                                                                           \
+        static_assert(std::is_enum<ENUM_TYPE>::value, #ENUM_TYPE " must be an enum!");          \
+        static const std::pair<ENUM_TYPE, BasicJsonType> m[] = __VA_ARGS__;                     \
+        auto it = std::find_if(std::begin(m), std::end(m),                                      \
+                               [e](const std::pair<ENUM_TYPE, BasicJsonType>& ej_pair) -> bool  \
+        {                                                                                       \
+            return ej_pair.first == e;                                                          \
+        });                                                                                     \
+        j = ((it != std::end(m)) ? it : std::begin(m))->second;                                 \
+    }                                                                                           \
+    template<typename BasicJsonType>                                                            \
+    inline void from_json(const BasicJsonType& j, ENUM_TYPE& e)                                 \
+    {                                                                                           \
+        static_assert(std::is_enum<ENUM_TYPE>::value, #ENUM_TYPE " must be an enum!");          \
+        static const std::pair<ENUM_TYPE, BasicJsonType> m[] = __VA_ARGS__;                     \
+        auto it = std::find_if(std::begin(m), std::end(m),                                      \
+                               [&j](const std::pair<ENUM_TYPE, BasicJsonType>& ej_pair) -> bool \
+        {                                                                                       \
+            return ej_pair.second == j;                                                         \
+        });                                                                                     \
+        e = ((it != std::end(m)) ? it : std::begin(m))->first;                                  \
+    }
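+
+// Illustrative usage sketch: the first pair acts as the fallback in both
+// directions for values not listed, so put a sensible default first, e.g.
+//
+//   enum class Color { red, green, unknown };
+//   NLOHMANN_JSON_SERIALIZE_ENUM(Color, {
+//       {Color::unknown, nullptr},
+//       {Color::red, "red"},
+//       {Color::green, "green"},
+//   })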
+
+// Ugly macros to avoid uglier copy-paste when specializing basic_json. They
+// may be removed in the future once the class is split.
+
+#define NLOHMANN_BASIC_JSON_TPL_DECLARATION                                \
+    template<template<typename, typename, typename...> class ObjectType,   \
+             template<typename, typename...> class ArrayType,              \
+             class StringType, class BooleanType, class NumberIntegerType, \
+             class NumberUnsignedType, class NumberFloatType,              \
+             template<typename> class AllocatorType,                       \
+             template<typename, typename = void> class JSONSerializer,     \
+             class BinaryType>
+
+#define NLOHMANN_BASIC_JSON_TPL                                            \
+    basic_json<ObjectType, ArrayType, StringType, BooleanType,             \
+    NumberIntegerType, NumberUnsignedType, NumberFloatType,                \
+    AllocatorType, JSONSerializer, BinaryType>
+
+// Macros to simplify conversion from/to types
+
+#define NLOHMANN_JSON_EXPAND( x ) x
+#define NLOHMANN_JSON_GET_MACRO(_1, _2, _3, _4, _5, _6, _7, _8, _9, _10, _11, _12, _13, _14, _15, _16, _17, _18, _19, _20, _21, _22, _23, _24, _25, _26, _27, _28, _29, _30, _31, _32, _33, _34, _35, _36, _37, _38, _39, _40, _41, _42, _43, _44, _45, _46, _47, _48, _49, _50, _51, _52, _53, _54, _55, _56, _57, _58, _59, _60, _61, _62, _63, _64, NAME,...) NAME
+#define NLOHMANN_JSON_PASTE(...) NLOHMANN_JSON_EXPAND(NLOHMANN_JSON_GET_MACRO(__VA_ARGS__, \
+        NLOHMANN_JSON_PASTE64, \
+        NLOHMANN_JSON_PASTE63, \
+        NLOHMANN_JSON_PASTE62, \
+        NLOHMANN_JSON_PASTE61, \
+        NLOHMANN_JSON_PASTE60, \
+        NLOHMANN_JSON_PASTE59, \
+        NLOHMANN_JSON_PASTE58, \
+        NLOHMANN_JSON_PASTE57, \
+        NLOHMANN_JSON_PASTE56, \
+        NLOHMANN_JSON_PASTE55, \
+        NLOHMANN_JSON_PASTE54, \
+        NLOHMANN_JSON_PASTE53, \
+        NLOHMANN_JSON_PASTE52, \
+        NLOHMANN_JSON_PASTE51, \
+        NLOHMANN_JSON_PASTE50, \
+        NLOHMANN_JSON_PASTE49, \
+        NLOHMANN_JSON_PASTE48, \
+        NLOHMANN_JSON_PASTE47, \
+        NLOHMANN_JSON_PASTE46, \
+        NLOHMANN_JSON_PASTE45, \
+        NLOHMANN_JSON_PASTE44, \
+        NLOHMANN_JSON_PASTE43, \
+        NLOHMANN_JSON_PASTE42, \
+        NLOHMANN_JSON_PASTE41, \
+        NLOHMANN_JSON_PASTE40, \
+        NLOHMANN_JSON_PASTE39, \
+        NLOHMANN_JSON_PASTE38, \
+        NLOHMANN_JSON_PASTE37, \
+        NLOHMANN_JSON_PASTE36, \
+        NLOHMANN_JSON_PASTE35, \
+        NLOHMANN_JSON_PASTE34, \
+        NLOHMANN_JSON_PASTE33, \
+        NLOHMANN_JSON_PASTE32, \
+        NLOHMANN_JSON_PASTE31, \
+        NLOHMANN_JSON_PASTE30, \
+        NLOHMANN_JSON_PASTE29, \
+        NLOHMANN_JSON_PASTE28, \
+        NLOHMANN_JSON_PASTE27, \
+        NLOHMANN_JSON_PASTE26, \
+        NLOHMANN_JSON_PASTE25, \
+        NLOHMANN_JSON_PASTE24, \
+        NLOHMANN_JSON_PASTE23, \
+        NLOHMANN_JSON_PASTE22, \
+        NLOHMANN_JSON_PASTE21, \
+        NLOHMANN_JSON_PASTE20, \
+        NLOHMANN_JSON_PASTE19, \
+        NLOHMANN_JSON_PASTE18, \
+        NLOHMANN_JSON_PASTE17, \
+        NLOHMANN_JSON_PASTE16, \
+        NLOHMANN_JSON_PASTE15, \
+        NLOHMANN_JSON_PASTE14, \
+        NLOHMANN_JSON_PASTE13, \
+        NLOHMANN_JSON_PASTE12, \
+        NLOHMANN_JSON_PASTE11, \
+        NLOHMANN_JSON_PASTE10, \
+        NLOHMANN_JSON_PASTE9, \
+        NLOHMANN_JSON_PASTE8, \
+        NLOHMANN_JSON_PASTE7, \
+        NLOHMANN_JSON_PASTE6, \
+        NLOHMANN_JSON_PASTE5, \
+        NLOHMANN_JSON_PASTE4, \
+        NLOHMANN_JSON_PASTE3, \
+        NLOHMANN_JSON_PASTE2, \
+        NLOHMANN_JSON_PASTE1)(__VA_ARGS__))
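+// The dispatch above counts arguments: NLOHMANN_JSON_PASTE(f, a, b, c) shifts
+// the PASTE64..PASTE1 list so that the 65th argument (NAME) resolves to
+// NLOHMANN_JSON_PASTE4, which the recursive definitions below expand to
+// f(a) f(b) f(c).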
+#define NLOHMANN_JSON_PASTE2(func, v1) func(v1)
+#define NLOHMANN_JSON_PASTE3(func, v1, v2) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE2(func, v2)
+#define NLOHMANN_JSON_PASTE4(func, v1, v2, v3) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE3(func, v2, v3)
+#define NLOHMANN_JSON_PASTE5(func, v1, v2, v3, v4) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE4(func, v2, v3, v4)
+#define NLOHMANN_JSON_PASTE6(func, v1, v2, v3, v4, v5) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE5(func, v2, v3, v4, v5)
+#define NLOHMANN_JSON_PASTE7(func, v1, v2, v3, v4, v5, v6) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE6(func, v2, v3, v4, v5, v6)
+#define NLOHMANN_JSON_PASTE8(func, v1, v2, v3, v4, v5, v6, v7) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE7(func, v2, v3, v4, v5, v6, v7)
+#define NLOHMANN_JSON_PASTE9(func, v1, v2, v3, v4, v5, v6, v7, v8) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE8(func, v2, v3, v4, v5, v6, v7, v8)
+#define NLOHMANN_JSON_PASTE10(func, v1, v2, v3, v4, v5, v6, v7, v8, v9) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE9(func, v2, v3, v4, v5, v6, v7, v8, v9)
+#define NLOHMANN_JSON_PASTE11(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE10(func, v2, v3, v4, v5, v6, v7, v8, v9, v10)
+#define NLOHMANN_JSON_PASTE12(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE11(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11)
+#define NLOHMANN_JSON_PASTE13(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE12(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12)
+#define NLOHMANN_JSON_PASTE14(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE13(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13)
+#define NLOHMANN_JSON_PASTE15(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE14(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14)
+#define NLOHMANN_JSON_PASTE16(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE15(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15)
+#define NLOHMANN_JSON_PASTE17(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE16(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16)
+#define NLOHMANN_JSON_PASTE18(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE17(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17)
+#define NLOHMANN_JSON_PASTE19(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE18(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18)
+#define NLOHMANN_JSON_PASTE20(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE19(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19)
+#define NLOHMANN_JSON_PASTE21(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE20(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20)
+#define NLOHMANN_JSON_PASTE22(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE21(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21)
+#define NLOHMANN_JSON_PASTE23(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE22(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22)
+#define NLOHMANN_JSON_PASTE24(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE23(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23)
+#define NLOHMANN_JSON_PASTE25(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE24(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24)
+#define NLOHMANN_JSON_PASTE26(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE25(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25)
+#define NLOHMANN_JSON_PASTE27(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE26(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26)
+#define NLOHMANN_JSON_PASTE28(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE27(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27)
+#define NLOHMANN_JSON_PASTE29(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE28(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28)
+#define NLOHMANN_JSON_PASTE30(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE29(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29)
+#define NLOHMANN_JSON_PASTE31(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE30(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30)
+#define NLOHMANN_JSON_PASTE32(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE31(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31)
+#define NLOHMANN_JSON_PASTE33(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE32(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32)
+#define NLOHMANN_JSON_PASTE34(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE33(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33)
+#define NLOHMANN_JSON_PASTE35(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE34(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34)
+#define NLOHMANN_JSON_PASTE36(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE35(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35)
+#define NLOHMANN_JSON_PASTE37(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE36(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36)
+#define NLOHMANN_JSON_PASTE38(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE37(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37)
+#define NLOHMANN_JSON_PASTE39(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE38(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38)
+#define NLOHMANN_JSON_PASTE40(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE39(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39)
+#define NLOHMANN_JSON_PASTE41(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE40(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40)
+#define NLOHMANN_JSON_PASTE42(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE41(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41)
+#define NLOHMANN_JSON_PASTE43(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE42(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42)
+#define NLOHMANN_JSON_PASTE44(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE43(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43)
+#define NLOHMANN_JSON_PASTE45(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE44(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44)
+#define NLOHMANN_JSON_PASTE46(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE45(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45)
+#define NLOHMANN_JSON_PASTE47(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE46(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46)
+#define NLOHMANN_JSON_PASTE48(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE47(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47)
+#define NLOHMANN_JSON_PASTE49(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE48(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48)
+#define NLOHMANN_JSON_PASTE50(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48, v49) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE49(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48, v49)
+#define NLOHMANN_JSON_PASTE51(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48, v49, v50) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE50(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48, v49, v50)
+#define NLOHMANN_JSON_PASTE52(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48, v49, v50, v51) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE51(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48, v49, v50, v51)
+#define NLOHMANN_JSON_PASTE53(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48, v49, v50, v51, v52) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE52(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48, v49, v50, v51, v52)
+#define NLOHMANN_JSON_PASTE54(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48, v49, v50, v51, v52, v53) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE53(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48, v49, v50, v51, v52, v53)
+#define NLOHMANN_JSON_PASTE55(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48, v49, v50, v51, v52, v53, v54) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE54(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48, v49, v50, v51, v52, v53, v54)
+#define NLOHMANN_JSON_PASTE56(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48, v49, v50, v51, v52, v53, v54, v55) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE55(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48, v49, v50, v51, v52, v53, v54, v55)
+#define NLOHMANN_JSON_PASTE57(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48, v49, v50, v51, v52, v53, v54, v55, v56) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE56(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48, v49, v50, v51, v52, v53, v54, v55, v56)
+#define NLOHMANN_JSON_PASTE58(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48, v49, v50, v51, v52, v53, v54, v55, v56, v57) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE57(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48, v49, v50, v51, v52, v53, v54, v55, v56, v57)
+#define NLOHMANN_JSON_PASTE59(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48, v49, v50, v51, v52, v53, v54, v55, v56, v57, v58) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE58(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48, v49, v50, v51, v52, v53, v54, v55, v56, v57, v58)
+#define NLOHMANN_JSON_PASTE60(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48, v49, v50, v51, v52, v53, v54, v55, v56, v57, v58, v59) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE59(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48, v49, v50, v51, v52, v53, v54, v55, v56, v57, v58, v59)
+#define NLOHMANN_JSON_PASTE61(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48, v49, v50, v51, v52, v53, v54, v55, v56, v57, v58, v59, v60) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE60(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48, v49, v50, v51, v52, v53, v54, v55, v56, v57, v58, v59, v60)
+#define NLOHMANN_JSON_PASTE62(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48, v49, v50, v51, v52, v53, v54, v55, v56, v57, v58, v59, v60, v61) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE61(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48, v49, v50, v51, v52, v53, v54, v55, v56, v57, v58, v59, v60, v61)
+#define NLOHMANN_JSON_PASTE63(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48, v49, v50, v51, v52, v53, v54, v55, v56, v57, v58, v59, v60, v61, v62) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE62(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48, v49, v50, v51, v52, v53, v54, v55, v56, v57, v58, v59, v60, v61, v62)
+#define NLOHMANN_JSON_PASTE64(func, v1, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48, v49, v50, v51, v52, v53, v54, v55, v56, v57, v58, v59, v60, v61, v62, v63) NLOHMANN_JSON_PASTE2(func, v1) NLOHMANN_JSON_PASTE63(func, v2, v3, v4, v5, v6, v7, v8, v9, v10, v11, v12, v13, v14, v15, v16, v17, v18, v19, v20, v21, v22, v23, v24, v25, v26, v27, v28, v29, v30, v31, v32, v33, v34, v35, v36, v37, v38, v39, v40, v41, v42, v43, v44, v45, v46, v47, v48, v49, v50, v51, v52, v53, v54, v55, v56, v57, v58, v59, v60, v61, v62, v63)
+
+#define NLOHMANN_JSON_TO(v1) nlohmann_json_j[#v1] = nlohmann_json_t.v1;
+#define NLOHMANN_JSON_FROM(v1) nlohmann_json_j.at(#v1).get_to(nlohmann_json_t.v1);
+
+/*!
+@brief macro to define to_json()/from_json() as friend functions of a class,
+       serializing the member variables listed as arguments (this form also
+       works when the members are private)
+@def NLOHMANN_DEFINE_TYPE_INTRUSIVE
+@since version 3.9.0
+*/
+#define NLOHMANN_DEFINE_TYPE_INTRUSIVE(Type, ...)  \
+    friend void to_json(nlohmann::json& nlohmann_json_j, const Type& nlohmann_json_t) { NLOHMANN_JSON_EXPAND(NLOHMANN_JSON_PASTE(NLOHMANN_JSON_TO, __VA_ARGS__)) } \
+    friend void from_json(const nlohmann::json& nlohmann_json_j, Type& nlohmann_json_t) { NLOHMANN_JSON_EXPAND(NLOHMANN_JSON_PASTE(NLOHMANN_JSON_FROM, __VA_ARGS__)) }
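+
+// Example (editor's sketch, not part of the upstream header): because the
+// generated functions are friends, private members can be listed directly:
+//
+//   class person {
+//       std::string name;
+//       int age = 0;
+//     public:
+//       NLOHMANN_DEFINE_TYPE_INTRUSIVE(person, name, age)
+//   };
+//
+//   nlohmann::json j = person{};   // {"age":0,"name":""}
+//   auto p = j.get<person>();      // round-trip via the generated from_json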
+
+/*!
+@brief macro to define to_json()/from_json() as free functions in the type's
+       namespace, serializing the member variables listed as arguments (the
+       members must be accessible, i.e. public)
+@def NLOHMANN_DEFINE_TYPE_NON_INTRUSIVE
+@since version 3.9.0
+*/
+#define NLOHMANN_DEFINE_TYPE_NON_INTRUSIVE(Type, ...)  \
+    inline void to_json(nlohmann::json& nlohmann_json_j, const Type& nlohmann_json_t) { NLOHMANN_JSON_EXPAND(NLOHMANN_JSON_PASTE(NLOHMANN_JSON_TO, __VA_ARGS__)) } \
+    inline void from_json(const nlohmann::json& nlohmann_json_j, Type& nlohmann_json_t) { NLOHMANN_JSON_EXPAND(NLOHMANN_JSON_PASTE(NLOHMANN_JSON_FROM, __VA_ARGS__)) }
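+
+// Example (editor's sketch, not part of the upstream header): for a type with
+// public members, place the macro in the type's namespace so ADL finds it:
+//
+//   namespace ns {
+//   struct point { double x; double y; };
+//   NLOHMANN_DEFINE_TYPE_NON_INTRUSIVE(point, x, y)
+//   }
+//
+//   nlohmann::json j = ns::point{1.0, 2.0};   // {"x":1.0,"y":2.0}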
+
+
+// inspired by https://stackoverflow.com/a/26745591
+// allows calling any std function as in (e.g. with begin):
+//   using std::begin; begin(x);
+//
+// and uses the detection idiom to retrieve the return type
+// of such an expression
+#define NLOHMANN_CAN_CALL_STD_FUNC_IMPL(std_name)                                 \
+    namespace detail {                                                            \
+    using std::std_name;                                                          \
+    \
+    template<typename... T>                                                       \
+    using result_of_##std_name = decltype(std_name(std::declval<T>()...));        \
+    }                                                                             \
+    \
+    namespace detail2 {                                                           \
+    struct std_name##_tag                                                         \
+    {                                                                             \
+    };                                                                            \
+    \
+    template<typename... T>                                                       \
+    std_name##_tag std_name(T&&...);                                              \
+    \
+    template<typename... T>                                                       \
+    using result_of_##std_name = decltype(std_name(std::declval<T>()...));        \
+    \
+    template<typename... T>                                                       \
+    struct would_call_std_##std_name                                              \
+    {                                                                             \
+        static constexpr auto const value = ::nlohmann::detail::                  \
+                                            is_detected_exact<std_name##_tag, result_of_##std_name, T...>::value; \
+    };                                                                            \
+    } /* namespace detail2 */ \
+    \
+    template<typename... T>                                                       \
+    struct would_call_std_##std_name : detail2::would_call_std_##std_name<T...>   \
+    {                                                                             \
+    }
+
+#ifndef JSON_USE_IMPLICIT_CONVERSIONS
+    #define JSON_USE_IMPLICIT_CONVERSIONS 1
+#endif
+
+#if JSON_USE_IMPLICIT_CONVERSIONS
+    #define JSON_EXPLICIT
+#else
+    #define JSON_EXPLICIT explicit
+#endif
+
+#ifndef JSON_DIAGNOSTICS
+    #define JSON_DIAGNOSTICS 0
+#endif
+
+
+namespace nlohmann
+{
+namespace detail
+{
+
+/*!
+@brief replace all occurrences of a substring by another string
+
+@param[in,out] s  the string to manipulate; changed so that all
+               occurrences of @a f are replaced with @a t
+@param[in]     f  the substring to replace with @a t
+@param[in]     t  the string to replace @a f
+
+@pre The search string @a f must not be empty. **This precondition is
+enforced with an assertion.**
+
+@since version 2.0.0
+*/
+inline void replace_substring(std::string& s, const std::string& f,
+                              const std::string& t)
+{
+    JSON_ASSERT(!f.empty());
+    for (auto pos = s.find(f);                // find first occurrence of f
+            pos != std::string::npos;         // make sure f was found
+            s.replace(pos, f.size(), t),      // replace with t, and
+            pos = s.find(f, pos + t.size()))  // find next occurrence of f
+    {}
+}
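+
+// Example (editor's sketch): replace_substring(s, "/", "~1") turns
+// s == "a/b/c" into "a~1b~1c". Because the search resumes after the inserted
+// text (pos + t.size()), occurrences contained in t are not rescanned, which
+// also keeps calls like replace_substring(s, "~", "~0") from looping forever.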
+
+/*!
+ * @brief string escaping as described in RFC 6901 (Sect. 4)
+ * @param[in] s string to escape
+ * @return    escaped string
+ *
+ * Note the order of escaping "~" to "~0" and "/" to "~1" is important.
+ */
+inline std::string escape(std::string s)
+{
+    replace_substring(s, "~", "~0");
+    replace_substring(s, "/", "~1");
+    return s;
+}
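+
+// Example (editor's sketch): escape("a/b~c") yields "a~1b~0c". Replacing "~"
+// first guarantees that the "~1" sequences produced for "/" are not
+// themselves re-escaped.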
+
+/*!
+ * @brief string unescaping as described in RFC 6901 (Sect. 4)
+ * @param[in] s string to unescape
+ * @return    unescaped string
+ *
+ * Note the order of escaping "~1" to "/" and "~0" to "~" is important.
+ */
+inline void unescape(std::string& s)
+{
+    replace_substring(s, "~1", "/");
+    replace_substring(s, "~0", "~");
+}
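+
+// Example (editor's sketch): unescape turns "a~1b~0c" back into "a/b~c".
+// Handling "~1" before "~0" matters: for the input "~01" (the escaped form
+// of "~1"), the reverse order would first produce "~1" and then wrongly "/".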
+
+} // namespace detail
+} // namespace nlohmann
+
+// #include <nlohmann/detail/input/position_t.hpp>
+
+
+#include <cstddef> // size_t
+
+namespace nlohmann
+{
+namespace detail
+{
+/// struct to capture the start position of the current token
+struct position_t
+{
+    /// the total number of characters read
+    std::size_t chars_read_total = 0;
+    /// the number of characters read in the current line
+    std::size_t chars_read_current_line = 0;
+    /// the number of lines read
+    std::size_t lines_read = 0;
+
+    /// conversion to size_t to preserve SAX interface
+    constexpr operator size_t() const
+    {
+        return chars_read_total;
+    }
+};
+
+} // namespace detail
+} // namespace nlohmann
+
+// #include <nlohmann/detail/macro_scope.hpp>
+
+
+namespace nlohmann
+{
+namespace detail
+{
+////////////////
+// exceptions //
+////////////////
+
+/// @brief general exception of the @ref basic_json class
+/// @sa https://json.nlohmann.me/api/basic_json/exception/
+class exception : public std::exception
+{
+  public:
+    /// returns the explanatory string
+    const char* what() const noexcept override
+    {
+        return m.what();
+    }
+
+    /// the id of the exception
+    const int id; // NOLINT(cppcoreguidelines-non-private-member-variables-in-classes)
+
+  protected:
+    JSON_HEDLEY_NON_NULL(3)
+    exception(int id_, const char* what_arg) : id(id_), m(what_arg) {} // NOLINT(bugprone-throw-keyword-missing)
+
+    static std::string name(const std::string& ename, int id_)
+    {
+        return "[json.exception." + ename + "." + std::to_string(id_) + "] ";
+    }
+
+    template<typename BasicJsonType>
+    static std::string diagnostics(const BasicJsonType& leaf_element)
+    {
+#if JSON_DIAGNOSTICS
+        std::vector<std::string> tokens;
+        for (const auto* current = &leaf_element; current->m_parent != nullptr; current = current->m_parent)
+        {
+            switch (current->m_parent->type())
+            {
+                case value_t::array:
+                {
+                    for (std::size_t i = 0; i < current->m_parent->m_value.array->size(); ++i)
+                    {
+                        if (&current->m_parent->m_value.array->operator[](i) == current)
+                        {
+                            tokens.emplace_back(std::to_string(i));
+                            break;
+                        }
+                    }
+                    break;
+                }
+
+                case value_t::object:
+                {
+                    for (const auto& element : *current->m_parent->m_value.object)
+                    {
+                        if (&element.second == current)
+                        {
+                            tokens.emplace_back(element.first.c_str());
+                            break;
+                        }
+                    }
+                    break;
+                }
+
+                case value_t::null: // LCOV_EXCL_LINE
+                case value_t::string: // LCOV_EXCL_LINE
+                case value_t::boolean: // LCOV_EXCL_LINE
+                case value_t::number_integer: // LCOV_EXCL_LINE
+                case value_t::number_unsigned: // LCOV_EXCL_LINE
+                case value_t::number_float: // LCOV_EXCL_LINE
+                case value_t::binary: // LCOV_EXCL_LINE
+                case value_t::discarded: // LCOV_EXCL_LINE
+                default:   // LCOV_EXCL_LINE
+                    break; // LCOV_EXCL_LINE
+            }
+        }
+
+        if (tokens.empty())
+        {
+            return "";
+        }
+
+        return "(" + std::accumulate(tokens.rbegin(), tokens.rend(), std::string{},
+                                     [](const std::string & a, const std::string & b)
+        {
+            return a + "/" + detail::escape(b);
+        }) + ") ";
+#else
+        static_cast<void>(leaf_element);
+        return "";
+#endif
+    }
+
+  private:
+    /// an exception object as storage for error messages
+    std::runtime_error m;
+};
+
+/// @brief exception indicating a parse error
+/// @sa https://json.nlohmann.me/api/basic_json/parse_error/
+class parse_error : public exception
+{
+  public:
+    /*!
+    @brief create a parse error exception
+    @param[in] id_       the id of the exception
+    @param[in] pos       the position where the error occurred (or with
+                         chars_read_total=0 if the position cannot be
+                         determined)
+    @param[in] what_arg  the explanatory string
+    @return parse_error object
+    */
+    template<typename BasicJsonType>
+    static parse_error create(int id_, const position_t& pos, const std::string& what_arg, const BasicJsonType& context)
+    {
+        std::string w = exception::name("parse_error", id_) + "parse error" +
+                        position_string(pos) + ": " + exception::diagnostics(context) + what_arg;
+        return {id_, pos.chars_read_total, w.c_str()};
+    }
+
+    template<typename BasicJsonType>
+    static parse_error create(int id_, std::size_t byte_, const std::string& what_arg, const BasicJsonType& context)
+    {
+        std::string w = exception::name("parse_error", id_) + "parse error" +
+                        (byte_ != 0 ? (" at byte " + std::to_string(byte_)) : "") +
+                        ": " + exception::diagnostics(context) + what_arg;
+        return {id_, byte_, w.c_str()};
+    }
+
+    /*!
+    @brief byte index of the parse error
+
+    The byte index of the last read character in the input file.
+
+    @note For an input with n bytes, 1 is the index of the first character and
+          n+1 is the index of the terminating null byte or the end of file.
+          This also holds true when reading a byte vector (CBOR or MessagePack).
+    */
+    const std::size_t byte;
+
+  private:
+    parse_error(int id_, std::size_t byte_, const char* what_arg)
+        : exception(id_, what_arg), byte(byte_) {}
+
+    static std::string position_string(const position_t& pos)
+    {
+        return " at line " + std::to_string(pos.lines_read + 1) +
+               ", column " + std::to_string(pos.chars_read_current_line);
+    }
+};
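+
+// Example (editor's sketch; callers normally see this type through the
+// nlohmann::json::parse_error alias):
+//
+//   try { nlohmann::json::parse("[1,2"); }
+//   catch (const nlohmann::json::parse_error& e) {
+//       // e.what() starts with "[json.exception.parse_error.101] parse error
+//       // at line 1, column ..."; e.byte reports the same position in bytes.
+//   }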
+
+/// @brief exception indicating errors with iterators
+/// @sa https://json.nlohmann.me/api/basic_json/invalid_iterator/
+class invalid_iterator : public exception
+{
+  public:
+    template<typename BasicJsonType>
+    static invalid_iterator create(int id_, const std::string& what_arg, const BasicJsonType& context)
+    {
+        std::string w = exception::name("invalid_iterator", id_) + exception::diagnostics(context) + what_arg;
+        return {id_, w.c_str()};
+    }
+
+  private:
+    JSON_HEDLEY_NON_NULL(3)
+    invalid_iterator(int id_, const char* what_arg)
+        : exception(id_, what_arg) {}
+};
+
+/// @brief exception indicating executing a member function with a wrong type
+/// @sa https://json.nlohmann.me/api/basic_json/type_error/
+class type_error : public exception
+{
+  public:
+    template<typename BasicJsonType>
+    static type_error create(int id_, const std::string& what_arg, const BasicJsonType& context)
+    {
+        std::string w = exception::name("type_error", id_) + exception::diagnostics(context) + what_arg;
+        return {id_, w.c_str()};
+    }
+
+  private:
+    JSON_HEDLEY_NON_NULL(3)
+    type_error(int id_, const char* what_arg) : exception(id_, what_arg) {}
+};
+
+/// @brief exception indicating access out of the defined range
+/// @sa https://json.nlohmann.me/api/basic_json/out_of_range/
+class out_of_range : public exception
+{
+  public:
+    template<typename BasicJsonType>
+    static out_of_range create(int id_, const std::string& what_arg, const BasicJsonType& context)
+    {
+        std::string w = exception::name("out_of_range", id_) + exception::diagnostics(context) + what_arg;
+        return {id_, w.c_str()};
+    }
+
+  private:
+    JSON_HEDLEY_NON_NULL(3)
+    out_of_range(int id_, const char* what_arg) : exception(id_, what_arg) {}
+};
+
+/// @brief exception indicating other library errors
+/// @sa https://json.nlohmann.me/api/basic_json/other_error/
+class other_error : public exception
+{
+  public:
+    template<typename BasicJsonType>
+    static other_error create(int id_, const std::string& what_arg, const BasicJsonType& context)
+    {
+        std::string w = exception::name("other_error", id_) + exception::diagnostics(context) + what_arg;
+        return {id_, w.c_str()};
+    }
+
+  private:
+    JSON_HEDLEY_NON_NULL(3)
+    other_error(int id_, const char* what_arg) : exception(id_, what_arg) {}
+};
+
+}  // namespace detail
+}  // namespace nlohmann
+
+// #include <nlohmann/detail/macro_scope.hpp>
+
+// #include <nlohmann/detail/meta/cpp_future.hpp>
+
+
+#include <cstddef> // size_t
+#include <type_traits> // conditional, enable_if, false_type, integral_constant, is_constructible, is_integral, is_same, remove_cv, remove_reference, true_type
+#include <utility> // index_sequence, make_index_sequence, index_sequence_for
+
+// #include <nlohmann/detail/macro_scope.hpp>
+
+
+namespace nlohmann
+{
+namespace detail
+{
+
+template<typename T>
+using uncvref_t = typename std::remove_cv<typename std::remove_reference<T>::type>::type;
+
+#ifdef JSON_HAS_CPP_14
+
+// the following utilities are natively available in C++14
+using std::enable_if_t;
+using std::index_sequence;
+using std::make_index_sequence;
+using std::index_sequence_for;
+
+#else
+
+// alias templates to reduce boilerplate
+template<bool B, typename T = void>
+using enable_if_t = typename std::enable_if<B, T>::type;
+
+// The following code is taken from https://github.com/abseil/abseil-cpp/blob/10cb35e459f5ecca5b2ff107635da0bfa41011b4/absl/utility/utility.h
+// which is part of Google Abseil (https://github.com/abseil/abseil-cpp), licensed under the Apache License 2.0.
+
+//// START OF CODE FROM GOOGLE ABSEIL
+
+// integer_sequence
+//
+// Class template representing a compile-time integer sequence. An instantiation
+// of `integer_sequence<T, Ints...>` has a sequence of integers encoded in its
+// type through its template arguments (which is a common need when
+// working with C++11 variadic templates). `absl::integer_sequence` is designed
+// to be a drop-in replacement for C++14's `std::integer_sequence`.
+//
+// Example:
+//
+//   template< class T, T... Ints >
+//   void user_function(integer_sequence<T, Ints...>);
+//
+//   int main()
+//   {
+//     // user_function's `T` will be deduced to `int` and `Ints...`
+//     // will be deduced to `0, 1, 2, 3, 4`.
+//     user_function(make_integer_sequence<int, 5>());
+//   }
+template <typename T, T... Ints>
+struct integer_sequence
+{
+    using value_type = T;
+    static constexpr std::size_t size() noexcept
+    {
+        return sizeof...(Ints);
+    }
+};
+
+// index_sequence
+//
+// A helper template for an `integer_sequence` of `size_t`,
+// `absl::index_sequence` is designed to be a drop-in replacement for C++14's
+// `std::index_sequence`.
+template <size_t... Ints>
+using index_sequence = integer_sequence<size_t, Ints...>;
+
+namespace utility_internal
+{
+
+template <typename Seq, size_t SeqSize, size_t Rem>
+struct Extend;
+
+// Note that SeqSize == sizeof...(Ints). It's passed explicitly for efficiency.
+template <typename T, T... Ints, size_t SeqSize>
+struct Extend<integer_sequence<T, Ints...>, SeqSize, 0>
+{
+    using type = integer_sequence < T, Ints..., (Ints + SeqSize)... >;
+};
+
+template <typename T, T... Ints, size_t SeqSize>
+struct Extend<integer_sequence<T, Ints...>, SeqSize, 1>
+{
+    using type = integer_sequence < T, Ints..., (Ints + SeqSize)..., 2 * SeqSize >;
+};
+
+// Recursion helper for 'make_integer_sequence<T, N>'.
+// 'Gen<T, N>::type' is an alias for 'integer_sequence<T, 0, 1, ... N-1>'.
+template <typename T, size_t N>
+struct Gen
+{
+    using type =
+        typename Extend < typename Gen < T, N / 2 >::type, N / 2, N % 2 >::type;
+};
+
+template <typename T>
+struct Gen<T, 0>
+{
+    using type = integer_sequence<T>;
+};
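+
+// Example (editor's trace of the recursion above): for N == 5,
+//   Gen<int, 5> = Extend<Gen<int, 2>::type, 2, 1>
+//   Gen<int, 2> = Extend<Gen<int, 1>::type, 1, 0>  -> integer_sequence<int, 0, 1>
+//   Gen<int, 1> = Extend<Gen<int, 0>::type, 0, 1>  -> integer_sequence<int, 0>
+// so Gen<int, 5>::type is integer_sequence<int, 0, 1, 2, 3, 4>, built with
+// O(log N) instantiations instead of one per element.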
+
+}  // namespace utility_internal
+
+// Compile-time sequences of integers
+
+// make_integer_sequence
+//
+// This template alias is equivalent to
+// `integer_sequence<int, 0, 1, ..., N-1>`, and is designed to be a drop-in
+// replacement for C++14's `std::make_integer_sequence`.
+template <typename T, T N>
+using make_integer_sequence = typename utility_internal::Gen<T, N>::type;
+
+// make_index_sequence
+//
+// This template alias is equivalent to `index_sequence<0, 1, ..., N-1>`,
+// and is designed to be a drop-in replacement for C++14's
+// `std::make_index_sequence`.
+template <size_t N>
+using make_index_sequence = make_integer_sequence<size_t, N>;
+
+// index_sequence_for
+//
+// Converts a typename pack into an index sequence of the same length, and
+// is designed to be a drop-in replacement for C++14's
+// `std::index_sequence_for()`
+template <typename... Ts>
+using index_sequence_for = make_index_sequence<sizeof...(Ts)>;
+
+//// END OF CODE FROM GOOGLE ABSEIL
+
+#endif
+
+// dispatch utility (taken from ranges-v3)
+template<unsigned N> struct priority_tag : priority_tag < N - 1 > {};
+template<> struct priority_tag<0> {};
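+
+// Example (editor's sketch): for a call f(priority_tag<1>{}), an overload
+// taking priority_tag<1> is an exact match while one taking priority_tag<0>
+// needs a derived-to-base conversion, so the former wins; the
+// from_json_array_impl overloads later in this header use exactly this trick.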
+
+// taken from ranges-v3
+template<typename T>
+struct static_const
+{
+    static constexpr T value{};
+};
+
+template<typename T>
+constexpr T static_const<T>::value; // NOLINT(readability-redundant-declaration)
+
+}  // namespace detail
+}  // namespace nlohmann
+
+// #include <nlohmann/detail/meta/identity_tag.hpp>
+
+
+namespace nlohmann
+{
+namespace detail
+{
+// dispatching helper struct
+template <class T> struct identity_tag {};
+}  // namespace detail
+}  // namespace nlohmann
+
+// #include <nlohmann/detail/meta/type_traits.hpp>
+
+
+#include <limits> // numeric_limits
+#include <type_traits> // false_type, is_constructible, is_integral, is_same, true_type
+#include <utility> // declval
+#include <tuple> // tuple
+
+// #include <nlohmann/detail/macro_scope.hpp>
+
+
+// #include <nlohmann/detail/iterators/iterator_traits.hpp>
+
+
+#include <iterator> // random_access_iterator_tag
+
+// #include <nlohmann/detail/meta/void_t.hpp>
+
+// #include <nlohmann/detail/meta/cpp_future.hpp>
+
+
+namespace nlohmann
+{
+namespace detail
+{
+template<typename It, typename = void>
+struct iterator_types {};
+
+template<typename It>
+struct iterator_types <
+    It,
+    void_t<typename It::difference_type, typename It::value_type, typename It::pointer,
+    typename It::reference, typename It::iterator_category >>
+{
+    using difference_type = typename It::difference_type;
+    using value_type = typename It::value_type;
+    using pointer = typename It::pointer;
+    using reference = typename It::reference;
+    using iterator_category = typename It::iterator_category;
+};
+
+// This is required as some compilers implement std::iterator_traits in a way that
+// doesn't work with SFINAE. See https://github.com/nlohmann/json/issues/1341.
+template<typename T, typename = void>
+struct iterator_traits
+{
+};
+
+template<typename T>
+struct iterator_traits < T, enable_if_t < !std::is_pointer<T>::value >>
+            : iterator_types<T>
+{
+};
+
+template<typename T>
+struct iterator_traits<T*, enable_if_t<std::is_object<T>::value>>
+{
+    using iterator_category = std::random_access_iterator_tag;
+    using value_type = T;
+    using difference_type = ptrdiff_t;
+    using pointer = T*;
+    using reference = T&;
+};
+} // namespace detail
+} // namespace nlohmann
+
+// #include <nlohmann/detail/meta/call_std/begin.hpp>
+
+
+// #include <nlohmann/detail/macro_scope.hpp>
+
+
+namespace nlohmann
+{
+NLOHMANN_CAN_CALL_STD_FUNC_IMPL(begin);
+} // namespace nlohmann
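+
+// Example (editor's sketch): detail::result_of_begin<std::vector<int>&> is
+// std::vector<int>::iterator, i.e. the type of `using std::begin; begin(v);`
+// evaluated for an lvalue vector v.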
+
+// #include <nlohmann/detail/meta/call_std/end.hpp>
+
+
+// #include <nlohmann/detail/macro_scope.hpp>
+
+
+namespace nlohmann
+{
+NLOHMANN_CAN_CALL_STD_FUNC_IMPL(end);
+}  // namespace nlohmann
+
+// #include <nlohmann/detail/meta/cpp_future.hpp>
+
+// #include <nlohmann/detail/meta/detected.hpp>
+
+// #include <nlohmann/json_fwd.hpp>
+#ifndef INCLUDE_NLOHMANN_JSON_FWD_HPP_
+#define INCLUDE_NLOHMANN_JSON_FWD_HPP_
+
+#include <cstdint> // int64_t, uint64_t
+#include <map> // map
+#include <memory> // allocator
+#include <string> // string
+#include <vector> // vector
+
+/*!
+@brief namespace for Niels Lohmann
+@see https://github.com/nlohmann
+@since version 1.0.0
+*/
+namespace nlohmann
+{
+/*!
+@brief default JSONSerializer template argument
+
+This serializer ignores the template arguments and uses ADL
+([argument-dependent lookup](https://en.cppreference.com/w/cpp/language/adl))
+for serialization.
+*/
+template<typename T = void, typename SFINAE = void>
+struct adl_serializer;
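+
+// Example (editor's sketch): thanks to the ADL hook, a free to_json/from_json
+// pair in the type's own namespace is found without specializing this
+// template:
+//
+//   namespace ns {
+//   struct coord { double lat; double lon; };
+//   void to_json(nlohmann::json& j, const coord& c) { j = {{"lat", c.lat}, {"lon", c.lon}}; }
+//   void from_json(const nlohmann::json& j, coord& c) { j.at("lat").get_to(c.lat); j.at("lon").get_to(c.lon); }
+//   }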
+
+/// a class to store JSON values
+/// @sa https://json.nlohmann.me/api/basic_json/
+template<template<typename U, typename V, typename... Args> class ObjectType =
+         std::map,
+         template<typename U, typename... Args> class ArrayType = std::vector,
+         class StringType = std::string, class BooleanType = bool,
+         class NumberIntegerType = std::int64_t,
+         class NumberUnsignedType = std::uint64_t,
+         class NumberFloatType = double,
+         template<typename U> class AllocatorType = std::allocator,
+         template<typename T, typename SFINAE = void> class JSONSerializer =
+         adl_serializer,
+         class BinaryType = std::vector<std::uint8_t>>
+class basic_json;
+
+/// @brief JSON Pointer defines a string syntax for identifying a specific value within a JSON document
+/// @sa https://json.nlohmann.me/api/json_pointer/
+template<typename BasicJsonType>
+class json_pointer;
+
+/*!
+@brief default specialization
+@sa https://json.nlohmann.me/api/json/
+*/
+using json = basic_json<>;
+
+/// @brief a minimal map-like container that preserves insertion order
+/// @sa https://json.nlohmann.me/api/ordered_map/
+template<class Key, class T, class IgnoredLess, class Allocator>
+struct ordered_map;
+
+/// @brief specialization that maintains the insertion order of object keys
+/// @sa https://json.nlohmann.me/api/ordered_json/
+using ordered_json = basic_json<nlohmann::ordered_map>;
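+
+// Example (editor's sketch): unlike json, whose std::map keeps keys sorted,
+// ordered_json preserves insertion order:
+//
+//   nlohmann::ordered_json oj;
+//   oj["z"] = 1;
+//   oj["a"] = 2;
+//   // oj.dump() == "{\"z\":1,\"a\":2}"; plain json would dump {"a":2,"z":1}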
+
+}  // namespace nlohmann
+
+#endif  // INCLUDE_NLOHMANN_JSON_FWD_HPP_
+
+
+namespace nlohmann
+{
+/*!
+@brief detail namespace with internal helper functions
+
+This namespace collects functions that should not be exposed,
+implementations of some @ref basic_json methods, and meta-programming helpers.
+
+@since version 2.1.0
+*/
+namespace detail
+{
+/////////////
+// helpers //
+/////////////
+
+// Note to maintainers:
+//
+// Every trait in this file expects a non-CV-qualified type.
+// The only exceptions are in the 'aliases for detected' section
+// (i.e. those of the form: decltype(T::member_function(std::declval<T>())))
+//
+// In this case, T has to be properly CV-qualified to constrain the function
+// arguments (e.g. to_json(BasicJsonType&, const T&))
+
+template<typename> struct is_basic_json : std::false_type {};
+
+NLOHMANN_BASIC_JSON_TPL_DECLARATION
+struct is_basic_json<NLOHMANN_BASIC_JSON_TPL> : std::true_type {};
+
+//////////////////////
+// json_ref helpers //
+//////////////////////
+
+template<typename>
+class json_ref;
+
+template<typename>
+struct is_json_ref : std::false_type {};
+
+template<typename T>
+struct is_json_ref<json_ref<T>> : std::true_type {};
+
+//////////////////////////
+// aliases for detected //
+//////////////////////////
+
+template<typename T>
+using mapped_type_t = typename T::mapped_type;
+
+template<typename T>
+using key_type_t = typename T::key_type;
+
+template<typename T>
+using value_type_t = typename T::value_type;
+
+template<typename T>
+using difference_type_t = typename T::difference_type;
+
+template<typename T>
+using pointer_t = typename T::pointer;
+
+template<typename T>
+using reference_t = typename T::reference;
+
+template<typename T>
+using iterator_category_t = typename T::iterator_category;
+
+template<typename T, typename... Args>
+using to_json_function = decltype(T::to_json(std::declval<Args>()...));
+
+template<typename T, typename... Args>
+using from_json_function = decltype(T::from_json(std::declval<Args>()...));
+
+template<typename T, typename U>
+using get_template_function = decltype(std::declval<T>().template get<U>());
+
+// trait checking if JSONSerializer<T>::from_json(json const&, udt&) exists
+template<typename BasicJsonType, typename T, typename = void>
+struct has_from_json : std::false_type {};
+
+// trait checking if j.get<T> is valid
+// use this trait instead of std::is_constructible or std::is_convertible:
+// both rely on implicit conversions and thus fail when T has several
+// constructors/operator= (see https://github.com/nlohmann/json/issues/958)
+template <typename BasicJsonType, typename T>
+struct is_getable
+{
+    static constexpr bool value = is_detected<get_template_function, const BasicJsonType&, T>::value;
+};
+
+template<typename BasicJsonType, typename T>
+struct has_from_json < BasicJsonType, T, enable_if_t < !is_basic_json<T>::value >>
+{
+    using serializer = typename BasicJsonType::template json_serializer<T, void>;
+
+    static constexpr bool value =
+        is_detected_exact<void, from_json_function, serializer,
+        const BasicJsonType&, T&>::value;
+};
+
+// This trait checks if JSONSerializer<T>::from_json(json const&) exists;
+// this overload is used for non-default-constructible user-defined types
+template<typename BasicJsonType, typename T, typename = void>
+struct has_non_default_from_json : std::false_type {};
+
+template<typename BasicJsonType, typename T>
+struct has_non_default_from_json < BasicJsonType, T, enable_if_t < !is_basic_json<T>::value >>
+{
+    using serializer = typename BasicJsonType::template json_serializer<T, void>;
+
+    static constexpr bool value =
+        is_detected_exact<T, from_json_function, serializer,
+        const BasicJsonType&>::value;
+};
+
+// This trait checks if BasicJsonType::json_serializer<T>::to_json exists
+// Do not evaluate the trait when T is a basic_json type, to avoid template instantiation infinite recursion.
+template<typename BasicJsonType, typename T, typename = void>
+struct has_to_json : std::false_type {};
+
+template<typename BasicJsonType, typename T>
+struct has_to_json < BasicJsonType, T, enable_if_t < !is_basic_json<T>::value >>
+{
+    using serializer = typename BasicJsonType::template json_serializer<T, void>;
+
+    static constexpr bool value =
+        is_detected_exact<void, to_json_function, serializer, BasicJsonType&,
+        T>::value;
+};
+
+
+///////////////////
+// is_ functions //
+///////////////////
+
+// https://en.cppreference.com/w/cpp/types/conjunction
+template<class...> struct conjunction : std::true_type { };
+template<class B1> struct conjunction<B1> : B1 { };
+template<class B1, class... Bn>
+struct conjunction<B1, Bn...>
+: std::conditional<bool(B1::value), conjunction<Bn...>, B1>::type {};
+
+// https://en.cppreference.com/w/cpp/types/negation
+template<class B> struct negation : std::integral_constant < bool, !B::value > { };
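+
+// Example (editor's sketch): conjunction<std::true_type, std::false_type>
+// derives from std::false_type; like std::conjunction it short-circuits, so
+// members after the first false one are never instantiated.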
+
+// Reimplementation of is_constructible and is_default_constructible, which are broken for
+// std::pair and std::tuple until the LWG 2367 fix (see https://cplusplus.github.io/LWG/lwg-defects.html#2367);
+// the defect causes compile errors in e.g. clang 3.5 or gcc 4.9.
+template <typename T>
+struct is_default_constructible : std::is_default_constructible<T> {};
+
+template <typename T1, typename T2>
+struct is_default_constructible<std::pair<T1, T2>>
+            : conjunction<is_default_constructible<T1>, is_default_constructible<T2>> {};
+
+template <typename T1, typename T2>
+struct is_default_constructible<const std::pair<T1, T2>>
+            : conjunction<is_default_constructible<T1>, is_default_constructible<T2>> {};
+
+template <typename... Ts>
+struct is_default_constructible<std::tuple<Ts...>>
+            : conjunction<is_default_constructible<Ts>...> {};
+
+template <typename... Ts>
+struct is_default_constructible<const std::tuple<Ts...>>
+            : conjunction<is_default_constructible<Ts>...> {};
+
+
+template <typename T, typename... Args>
+struct is_constructible : std::is_constructible<T, Args...> {};
+
+template <typename T1, typename T2>
+struct is_constructible<std::pair<T1, T2>> : is_default_constructible<std::pair<T1, T2>> {};
+
+template <typename T1, typename T2>
+struct is_constructible<const std::pair<T1, T2>> : is_default_constructible<const std::pair<T1, T2>> {};
+
+template <typename... Ts>
+struct is_constructible<std::tuple<Ts...>> : is_default_constructible<std::tuple<Ts...>> {};
+
+template <typename... Ts>
+struct is_constructible<const std::tuple<Ts...>> : is_default_constructible<const std::tuple<Ts...>> {};
+
+
+template<typename T, typename = void>
+struct is_iterator_traits : std::false_type {};
+
+template<typename T>
+struct is_iterator_traits<iterator_traits<T>>
+{
+  private:
+    using traits = iterator_traits<T>;
+
+  public:
+    static constexpr auto value =
+        is_detected<value_type_t, traits>::value &&
+        is_detected<difference_type_t, traits>::value &&
+        is_detected<pointer_t, traits>::value &&
+        is_detected<iterator_category_t, traits>::value &&
+        is_detected<reference_t, traits>::value;
+};
+
+template<typename T>
+struct is_range
+{
+  private:
+    using t_ref = typename std::add_lvalue_reference<T>::type;
+
+    using iterator = detected_t<result_of_begin, t_ref>;
+    using sentinel = detected_t<result_of_end, t_ref>;
+
+    // to be 100% correct, it should use https://en.cppreference.com/w/cpp/iterator/input_or_output_iterator
+    // and https://en.cppreference.com/w/cpp/iterator/sentinel_for
+    // but reimplementing these would be too much work, as a lot of other concepts are used underneath
+    static constexpr auto is_iterator_begin =
+        is_iterator_traits<iterator_traits<iterator>>::value;
+
+  public:
+    static constexpr bool value = !std::is_same<iterator, nonesuch>::value && !std::is_same<sentinel, nonesuch>::value && is_iterator_begin;
+};
+
+template<typename R>
+using iterator_t = enable_if_t<is_range<R>::value, result_of_begin<decltype(std::declval<R&>())>>;
+
+template<typename T>
+using range_value_t = value_type_t<iterator_traits<iterator_t<T>>>;
+
+// The following implementation of is_complete_type is taken from
+// https://blogs.msdn.microsoft.com/vcblog/2015/12/02/partial-support-for-expression-sfinae-in-vs-2015-update-1/
+// and is written by Xiang Fan who agreed to using it in this library.
+
+template<typename T, typename = void>
+struct is_complete_type : std::false_type {};
+
+template<typename T>
+struct is_complete_type<T, decltype(void(sizeof(T)))> : std::true_type {};
+
+template<typename BasicJsonType, typename CompatibleObjectType,
+         typename = void>
+struct is_compatible_object_type_impl : std::false_type {};
+
+template<typename BasicJsonType, typename CompatibleObjectType>
+struct is_compatible_object_type_impl <
+    BasicJsonType, CompatibleObjectType,
+    enable_if_t < is_detected<mapped_type_t, CompatibleObjectType>::value&&
+    is_detected<key_type_t, CompatibleObjectType>::value >>
+{
+    using object_t = typename BasicJsonType::object_t;
+
+    // macOS's is_constructible does not play well with nonesuch...
+    static constexpr bool value =
+        is_constructible<typename object_t::key_type,
+        typename CompatibleObjectType::key_type>::value &&
+        is_constructible<typename object_t::mapped_type,
+        typename CompatibleObjectType::mapped_type>::value;
+};
+
+template<typename BasicJsonType, typename CompatibleObjectType>
+struct is_compatible_object_type
+    : is_compatible_object_type_impl<BasicJsonType, CompatibleObjectType> {};
+
+template<typename BasicJsonType, typename ConstructibleObjectType,
+         typename = void>
+struct is_constructible_object_type_impl : std::false_type {};
+
+template<typename BasicJsonType, typename ConstructibleObjectType>
+struct is_constructible_object_type_impl <
+    BasicJsonType, ConstructibleObjectType,
+    enable_if_t < is_detected<mapped_type_t, ConstructibleObjectType>::value&&
+    is_detected<key_type_t, ConstructibleObjectType>::value >>
+{
+    using object_t = typename BasicJsonType::object_t;
+
+    static constexpr bool value =
+        (is_default_constructible<ConstructibleObjectType>::value &&
+         (std::is_move_assignable<ConstructibleObjectType>::value ||
+          std::is_copy_assignable<ConstructibleObjectType>::value) &&
+         (is_constructible<typename ConstructibleObjectType::key_type,
+          typename object_t::key_type>::value &&
+          std::is_same <
+          typename object_t::mapped_type,
+          typename ConstructibleObjectType::mapped_type >::value)) ||
+        (has_from_json<BasicJsonType,
+         typename ConstructibleObjectType::mapped_type>::value ||
+         has_non_default_from_json <
+         BasicJsonType,
+         typename ConstructibleObjectType::mapped_type >::value);
+};
+
+template<typename BasicJsonType, typename ConstructibleObjectType>
+struct is_constructible_object_type
+    : is_constructible_object_type_impl<BasicJsonType,
+      ConstructibleObjectType> {};
+
+template<typename BasicJsonType, typename CompatibleStringType>
+struct is_compatible_string_type
+{
+    static constexpr auto value =
+        is_constructible<typename BasicJsonType::string_t, CompatibleStringType>::value;
+};
+
+template<typename BasicJsonType, typename ConstructibleStringType>
+struct is_constructible_string_type
+{
+    static constexpr auto value =
+        is_constructible<ConstructibleStringType,
+        typename BasicJsonType::string_t>::value;
+};
+
+template<typename BasicJsonType, typename CompatibleArrayType, typename = void>
+struct is_compatible_array_type_impl : std::false_type {};
+
+template<typename BasicJsonType, typename CompatibleArrayType>
+struct is_compatible_array_type_impl <
+    BasicJsonType, CompatibleArrayType,
+    enable_if_t <
+    is_detected<iterator_t, CompatibleArrayType>::value&&
+    is_iterator_traits<iterator_traits<detected_t<iterator_t, CompatibleArrayType>>>::value&&
+// special case for types like std::filesystem::path whose iterator's value_type is the type itself
+// c.f. https://github.com/nlohmann/json/pull/3073
+    !std::is_same<CompatibleArrayType, detected_t<range_value_t, CompatibleArrayType>>::value >>
+{
+    static constexpr bool value =
+        is_constructible<BasicJsonType,
+        range_value_t<CompatibleArrayType>>::value;
+};
+
+template<typename BasicJsonType, typename CompatibleArrayType>
+struct is_compatible_array_type
+    : is_compatible_array_type_impl<BasicJsonType, CompatibleArrayType> {};
+
+template<typename BasicJsonType, typename ConstructibleArrayType, typename = void>
+struct is_constructible_array_type_impl : std::false_type {};
+
+template<typename BasicJsonType, typename ConstructibleArrayType>
+struct is_constructible_array_type_impl <
+    BasicJsonType, ConstructibleArrayType,
+    enable_if_t<std::is_same<ConstructibleArrayType,
+    typename BasicJsonType::value_type>::value >>
+            : std::true_type {};
+
+template<typename BasicJsonType, typename ConstructibleArrayType>
+struct is_constructible_array_type_impl <
+    BasicJsonType, ConstructibleArrayType,
+    enable_if_t < !std::is_same<ConstructibleArrayType,
+    typename BasicJsonType::value_type>::value&&
+    !is_compatible_string_type<BasicJsonType, ConstructibleArrayType>::value&&
+    is_default_constructible<ConstructibleArrayType>::value&&
+(std::is_move_assignable<ConstructibleArrayType>::value ||
+ std::is_copy_assignable<ConstructibleArrayType>::value)&&
+is_detected<iterator_t, ConstructibleArrayType>::value&&
+is_iterator_traits<iterator_traits<detected_t<iterator_t, ConstructibleArrayType>>>::value&&
+is_detected<range_value_t, ConstructibleArrayType>::value&&
+// special case for types like std::filesystem::path whose iterator's value_type is the type itself
+// c.f. https://github.com/nlohmann/json/pull/3073
+!std::is_same<ConstructibleArrayType, detected_t<range_value_t, ConstructibleArrayType>>::value&&
+        is_complete_type <
+        detected_t<range_value_t, ConstructibleArrayType >>::value >>
+{
+    using value_type = range_value_t<ConstructibleArrayType>;
+
+    static constexpr bool value =
+        std::is_same<value_type,
+        typename BasicJsonType::array_t::value_type>::value ||
+        has_from_json<BasicJsonType,
+        value_type>::value ||
+        has_non_default_from_json <
+        BasicJsonType,
+        value_type >::value;
+};
+
+template<typename BasicJsonType, typename ConstructibleArrayType>
+struct is_constructible_array_type
+    : is_constructible_array_type_impl<BasicJsonType, ConstructibleArrayType> {};
+
+template<typename RealIntegerType, typename CompatibleNumberIntegerType,
+         typename = void>
+struct is_compatible_integer_type_impl : std::false_type {};
+
+template<typename RealIntegerType, typename CompatibleNumberIntegerType>
+struct is_compatible_integer_type_impl <
+    RealIntegerType, CompatibleNumberIntegerType,
+    enable_if_t < std::is_integral<RealIntegerType>::value&&
+    std::is_integral<CompatibleNumberIntegerType>::value&&
+    !std::is_same<bool, CompatibleNumberIntegerType>::value >>
+{
+    // is there an assert somewhere on overflows?
+    using RealLimits = std::numeric_limits<RealIntegerType>;
+    using CompatibleLimits = std::numeric_limits<CompatibleNumberIntegerType>;
+
+    static constexpr auto value =
+        is_constructible<RealIntegerType,
+        CompatibleNumberIntegerType>::value &&
+        CompatibleLimits::is_integer &&
+        RealLimits::is_signed == CompatibleLimits::is_signed;
+};
+
+template<typename RealIntegerType, typename CompatibleNumberIntegerType>
+struct is_compatible_integer_type
+    : is_compatible_integer_type_impl<RealIntegerType,
+      CompatibleNumberIntegerType> {};
+
+template<typename BasicJsonType, typename CompatibleType, typename = void>
+struct is_compatible_type_impl: std::false_type {};
+
+template<typename BasicJsonType, typename CompatibleType>
+struct is_compatible_type_impl <
+    BasicJsonType, CompatibleType,
+    enable_if_t<is_complete_type<CompatibleType>::value >>
+{
+    static constexpr bool value =
+        has_to_json<BasicJsonType, CompatibleType>::value;
+};
+
+template<typename BasicJsonType, typename CompatibleType>
+struct is_compatible_type
+    : is_compatible_type_impl<BasicJsonType, CompatibleType> {};
+
+template<typename T1, typename T2>
+struct is_constructible_tuple : std::false_type {};
+
+template<typename T1, typename... Args>
+struct is_constructible_tuple<T1, std::tuple<Args...>> : conjunction<is_constructible<T1, Args>...> {};
+
+// a naive helper to check if a type is an ordered_map (exploits the fact that
+// ordered_map inherits capacity() from std::vector)
+template <typename T>
+struct is_ordered_map
+{
+    using one = char;
+
+    struct two
+    {
+        char x[2]; // NOLINT(cppcoreguidelines-avoid-c-arrays,hicpp-avoid-c-arrays,modernize-avoid-c-arrays)
+    };
+
+    template <typename C> static one test( decltype(&C::capacity) ) ;
+    template <typename C> static two test(...);
+
+    enum { value = sizeof(test<T>(nullptr)) == sizeof(char) }; // NOLINT(cppcoreguidelines-pro-type-vararg,hicpp-vararg)
+};
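+
+// Editor's note (sketch): test<T>(nullptr) prefers the overload whose
+// decltype(&C::capacity) parameter is well-formed, so value is 1 for
+// ordered_map (capacity() comes from its std::vector base) and 0 for types
+// without capacity(), such as std::map.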
+
+// to avoid useless casts (see https://github.com/nlohmann/json/issues/2893#issuecomment-889152324)
+template < typename T, typename U, enable_if_t < !std::is_same<T, U>::value, int > = 0 >
+T conditional_static_cast(U value)
+{
+    return static_cast<T>(value);
+}
+
+template<typename T, typename U, enable_if_t<std::is_same<T, U>::value, int> = 0>
+T conditional_static_cast(U value)
+{
+    return value;
+}
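+
+// Example (editor's sketch): conditional_static_cast<std::size_t>(n) emits a
+// static_cast only when n is not already a std::size_t; the identity overload
+// avoids compiler warnings about useless casts (see the issue linked above).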
+
+}  // namespace detail
+}  // namespace nlohmann
+
+// #include <nlohmann/detail/value_t.hpp>
+
+
+#if JSON_HAS_EXPERIMENTAL_FILESYSTEM
+#include <experimental/filesystem>
+namespace nlohmann::detail
+{
+namespace std_fs = std::experimental::filesystem;
+} // namespace nlohmann::detail
+#elif JSON_HAS_FILESYSTEM
+#include <filesystem>
+namespace nlohmann::detail
+{
+namespace std_fs = std::filesystem;
+} // namespace nlohmann::detail
+#endif
+
+namespace nlohmann
+{
+namespace detail
+{
+template<typename BasicJsonType>
+void from_json(const BasicJsonType& j, typename std::nullptr_t& n)
+{
+    if (JSON_HEDLEY_UNLIKELY(!j.is_null()))
+    {
+        JSON_THROW(type_error::create(302, "type must be null, but is " + std::string(j.type_name()), j));
+    }
+    n = nullptr;
+}
+
+// overloads for basic_json template parameters
+template < typename BasicJsonType, typename ArithmeticType,
+           enable_if_t < std::is_arithmetic<ArithmeticType>::value&&
+                         !std::is_same<ArithmeticType, typename BasicJsonType::boolean_t>::value,
+                         int > = 0 >
+void get_arithmetic_value(const BasicJsonType& j, ArithmeticType& val)
+{
+    switch (static_cast<value_t>(j))
+    {
+        case value_t::number_unsigned:
+        {
+            val = static_cast<ArithmeticType>(*j.template get_ptr<const typename BasicJsonType::number_unsigned_t*>());
+            break;
+        }
+        case value_t::number_integer:
+        {
+            val = static_cast<ArithmeticType>(*j.template get_ptr<const typename BasicJsonType::number_integer_t*>());
+            break;
+        }
+        case value_t::number_float:
+        {
+            val = static_cast<ArithmeticType>(*j.template get_ptr<const typename BasicJsonType::number_float_t*>());
+            break;
+        }
+
+        case value_t::null:
+        case value_t::object:
+        case value_t::array:
+        case value_t::string:
+        case value_t::boolean:
+        case value_t::binary:
+        case value_t::discarded:
+        default:
+            JSON_THROW(type_error::create(302, "type must be number, but is " + std::string(j.type_name()), j));
+    }
+}
+
+template<typename BasicJsonType>
+void from_json(const BasicJsonType& j, typename BasicJsonType::boolean_t& b)
+{
+    if (JSON_HEDLEY_UNLIKELY(!j.is_boolean()))
+    {
+        JSON_THROW(type_error::create(302, "type must be boolean, but is " + std::string(j.type_name()), j));
+    }
+    b = *j.template get_ptr<const typename BasicJsonType::boolean_t*>();
+}
+
+template<typename BasicJsonType>
+void from_json(const BasicJsonType& j, typename BasicJsonType::string_t& s)
+{
+    if (JSON_HEDLEY_UNLIKELY(!j.is_string()))
+    {
+        JSON_THROW(type_error::create(302, "type must be string, but is " + std::string(j.type_name()), j));
+    }
+    s = *j.template get_ptr<const typename BasicJsonType::string_t*>();
+}
+
+template <
+    typename BasicJsonType, typename ConstructibleStringType,
+    enable_if_t <
+        is_constructible_string_type<BasicJsonType, ConstructibleStringType>::value&&
+        !std::is_same<typename BasicJsonType::string_t,
+                      ConstructibleStringType>::value,
+        int > = 0 >
+void from_json(const BasicJsonType& j, ConstructibleStringType& s)
+{
+    if (JSON_HEDLEY_UNLIKELY(!j.is_string()))
+    {
+        JSON_THROW(type_error::create(302, "type must be string, but is " + std::string(j.type_name()), j));
+    }
+
+    s = *j.template get_ptr<const typename BasicJsonType::string_t*>();
+}
+
+template<typename BasicJsonType>
+void from_json(const BasicJsonType& j, typename BasicJsonType::number_float_t& val)
+{
+    get_arithmetic_value(j, val);
+}
+
+template<typename BasicJsonType>
+void from_json(const BasicJsonType& j, typename BasicJsonType::number_unsigned_t& val)
+{
+    get_arithmetic_value(j, val);
+}
+
+template<typename BasicJsonType>
+void from_json(const BasicJsonType& j, typename BasicJsonType::number_integer_t& val)
+{
+    get_arithmetic_value(j, val);
+}
+
+template<typename BasicJsonType, typename EnumType,
+         enable_if_t<std::is_enum<EnumType>::value, int> = 0>
+void from_json(const BasicJsonType& j, EnumType& e)
+{
+    typename std::underlying_type<EnumType>::type val;
+    get_arithmetic_value(j, val);
+    e = static_cast<EnumType>(val);
+}
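+
+// Example (editor's sketch): enums deserialize from their underlying integer
+// by default:
+//
+//   enum class color { red, green };
+//   auto c = nlohmann::json(1).get<color>();   // c == color::green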
+
+// std::forward_list has no insert method (only insert_after), so build it with a front_inserter
+template<typename BasicJsonType, typename T, typename Allocator,
+         enable_if_t<is_getable<BasicJsonType, T>::value, int> = 0>
+void from_json(const BasicJsonType& j, std::forward_list<T, Allocator>& l)
+{
+    if (JSON_HEDLEY_UNLIKELY(!j.is_array()))
+    {
+        JSON_THROW(type_error::create(302, "type must be array, but is " + std::string(j.type_name()), j));
+    }
+    l.clear();
+    std::transform(j.rbegin(), j.rend(),
+                   std::front_inserter(l), [](const BasicJsonType & i)
+    {
+        return i.template get<T>();
+    });
+}
+
+// valarray doesn't have an insert method
+template<typename BasicJsonType, typename T,
+         enable_if_t<is_getable<BasicJsonType, T>::value, int> = 0>
+void from_json(const BasicJsonType& j, std::valarray<T>& l)
+{
+    if (JSON_HEDLEY_UNLIKELY(!j.is_array()))
+    {
+        JSON_THROW(type_error::create(302, "type must be array, but is " + std::string(j.type_name()), j));
+    }
+    l.resize(j.size());
+    std::transform(j.begin(), j.end(), std::begin(l),
+                   [](const BasicJsonType & elem)
+    {
+        return elem.template get<T>();
+    });
+}
+
+template<typename BasicJsonType, typename T, std::size_t N>
+auto from_json(const BasicJsonType& j, T (&arr)[N])  // NOLINT(cppcoreguidelines-avoid-c-arrays,hicpp-avoid-c-arrays,modernize-avoid-c-arrays)
+-> decltype(j.template get<T>(), void())
+{
+    for (std::size_t i = 0; i < N; ++i)
+    {
+        arr[i] = j.at(i).template get<T>();
+    }
+}
+
+template<typename BasicJsonType>
+void from_json_array_impl(const BasicJsonType& j, typename BasicJsonType::array_t& arr, priority_tag<3> /*unused*/)
+{
+    arr = *j.template get_ptr<const typename BasicJsonType::array_t*>();
+}
+
+template<typename BasicJsonType, typename T, std::size_t N>
+auto from_json_array_impl(const BasicJsonType& j, std::array<T, N>& arr,
+                          priority_tag<2> /*unused*/)
+-> decltype(j.template get<T>(), void())
+{
+    for (std::size_t i = 0; i < N; ++i)
+    {
+        arr[i] = j.at(i).template get<T>();
+    }
+}
+
+template<typename BasicJsonType, typename ConstructibleArrayType,
+         enable_if_t<
+             std::is_assignable<ConstructibleArrayType&, ConstructibleArrayType>::value,
+             int> = 0>
+auto from_json_array_impl(const BasicJsonType& j, ConstructibleArrayType& arr, priority_tag<1> /*unused*/)
+-> decltype(
+    arr.reserve(std::declval<typename ConstructibleArrayType::size_type>()),
+    j.template get<typename ConstructibleArrayType::value_type>(),
+    void())
+{
+    using std::end;
+
+    ConstructibleArrayType ret;
+    ret.reserve(j.size());
+    std::transform(j.begin(), j.end(),
+                   std::inserter(ret, end(ret)), [](const BasicJsonType & i)
+    {
+        // get<BasicJsonType>() returns *this, so this won't call a from_json
+        // method when value_type is BasicJsonType
+        return i.template get<typename ConstructibleArrayType::value_type>();
+    });
+    arr = std::move(ret);
+}
+
+template<typename BasicJsonType, typename ConstructibleArrayType,
+         enable_if_t<
+             std::is_assignable<ConstructibleArrayType&, ConstructibleArrayType>::value,
+             int> = 0>
+void from_json_array_impl(const BasicJsonType& j, ConstructibleArrayType& arr,
+                          priority_tag<0> /*unused*/)
+{
+    using std::end;
+
+    ConstructibleArrayType ret;
+    std::transform(
+        j.begin(), j.end(), std::inserter(ret, end(ret)),
+        [](const BasicJsonType & i)
+    {
+        // get<BasicJsonType>() returns *this, so this won't call a from_json
+        // method when value_type is BasicJsonType
+        return i.template get<typename ConstructibleArrayType::value_type>();
+    });
+    arr = std::move(ret);
+}
+
+template < typename BasicJsonType, typename ConstructibleArrayType,
+           enable_if_t <
+               is_constructible_array_type<BasicJsonType, ConstructibleArrayType>::value&&
+               !is_constructible_object_type<BasicJsonType, ConstructibleArrayType>::value&&
+               !is_constructible_string_type<BasicJsonType, ConstructibleArrayType>::value&&
+               !std::is_same<ConstructibleArrayType, typename BasicJsonType::binary_t>::value&&
+               !is_basic_json<ConstructibleArrayType>::value,
+               int > = 0 >
+auto from_json(const BasicJsonType& j, ConstructibleArrayType& arr)
+-> decltype(from_json_array_impl(j, arr, priority_tag<3> {}),
+j.template get<typename ConstructibleArrayType::value_type>(),
+void())
+{
+    if (JSON_HEDLEY_UNLIKELY(!j.is_array()))
+    {
+        JSON_THROW(type_error::create(302, "type must be array, but is " + std::string(j.type_name()), j));
+    }
+
+    from_json_array_impl(j, arr, priority_tag<3> {});
+}
+
+template < typename BasicJsonType, typename T, std::size_t... Idx >
+std::array<T, sizeof...(Idx)> from_json_inplace_array_impl(BasicJsonType&& j,
+        identity_tag<std::array<T, sizeof...(Idx)>> /*unused*/, index_sequence<Idx...> /*unused*/)
+{
+    return { { std::forward<BasicJsonType>(j).at(Idx).template get<T>()... } };
+}
+
+template < typename BasicJsonType, typename T, std::size_t N >
+auto from_json(BasicJsonType&& j, identity_tag<std::array<T, N>> tag)
+-> decltype(from_json_inplace_array_impl(std::forward<BasicJsonType>(j), tag, make_index_sequence<N> {}))
+{
+    if (JSON_HEDLEY_UNLIKELY(!j.is_array()))
+    {
+        JSON_THROW(type_error::create(302, "type must be array, but is " + std::string(j.type_name()), j));
+    }
+
+    return from_json_inplace_array_impl(std::forward<BasicJsonType>(j), tag, make_index_sequence<N> {});
+}
+
+template<typename BasicJsonType>
+void from_json(const BasicJsonType& j, typename BasicJsonType::binary_t& bin)
+{
+    if (JSON_HEDLEY_UNLIKELY(!j.is_binary()))
+    {
+        JSON_THROW(type_error::create(302, "type must be binary, but is " + std::string(j.type_name()), j));
+    }
+
+    bin = *j.template get_ptr<const typename BasicJsonType::binary_t*>();
+}
+
+template<typename BasicJsonType, typename ConstructibleObjectType,
+         enable_if_t<is_constructible_object_type<BasicJsonType, ConstructibleObjectType>::value, int> = 0>
+void from_json(const BasicJsonType& j, ConstructibleObjectType& obj)
+{
+    if (JSON_HEDLEY_UNLIKELY(!j.is_object()))
+    {
+        JSON_THROW(type_error::create(302, "type must be object, but is " + std::string(j.type_name()), j));
+    }
+
+    ConstructibleObjectType ret;
+    const auto* inner_object = j.template get_ptr<const typename BasicJsonType::object_t*>();
+    using value_type = typename ConstructibleObjectType::value_type;
+    std::transform(
+        inner_object->begin(), inner_object->end(),
+        std::inserter(ret, ret.begin()),
+        [](typename BasicJsonType::object_t::value_type const & p)
+    {
+        return value_type(p.first, p.second.template get<typename ConstructibleObjectType::mapped_type>());
+    });
+    obj = std::move(ret);
+}
+
+// overload for arithmetic types, not chosen for basic_json template arguments
+// (BooleanType, etc.); note: is it really necessary to provide explicit
+// overloads for boolean_t etc. in case of a custom BooleanType which is not
+// an arithmetic type?
+template < typename BasicJsonType, typename ArithmeticType,
+           enable_if_t <
+               std::is_arithmetic<ArithmeticType>::value&&
+               !std::is_same<ArithmeticType, typename BasicJsonType::number_unsigned_t>::value&&
+               !std::is_same<ArithmeticType, typename BasicJsonType::number_integer_t>::value&&
+               !std::is_same<ArithmeticType, typename BasicJsonType::number_float_t>::value&&
+               !std::is_same<ArithmeticType, typename BasicJsonType::boolean_t>::value,
+               int > = 0 >
+void from_json(const BasicJsonType& j, ArithmeticType& val)
+{
+    switch (static_cast<value_t>(j))
+    {
+        case value_t::number_unsigned:
+        {
+            val = static_cast<ArithmeticType>(*j.template get_ptr<const typename BasicJsonType::number_unsigned_t*>());
+            break;
+        }
+        case value_t::number_integer:
+        {
+            val = static_cast<ArithmeticType>(*j.template get_ptr<const typename BasicJsonType::number_integer_t*>());
+            break;
+        }
+        case value_t::number_float:
+        {
+            val = static_cast<ArithmeticType>(*j.template get_ptr<const typename BasicJsonType::number_float_t*>());
+            break;
+        }
+        case value_t::boolean:
+        {
+            val = static_cast<ArithmeticType>(*j.template get_ptr<const typename BasicJsonType::boolean_t*>());
+            break;
+        }
+
+        case value_t::null:
+        case value_t::object:
+        case value_t::array:
+        case value_t::string:
+        case value_t::binary:
+        case value_t::discarded:
+        default:
+            JSON_THROW(type_error::create(302, "type must be number, but is " + std::string(j.type_name()), j));
+    }
+}
+
+template<typename BasicJsonType, typename... Args, std::size_t... Idx>
+std::tuple<Args...> from_json_tuple_impl_base(BasicJsonType&& j, index_sequence<Idx...> /*unused*/)
+{
+    return std::make_tuple(std::forward<BasicJsonType>(j).at(Idx).template get<Args>()...);
+}
+
+template < typename BasicJsonType, class A1, class A2 >
+std::pair<A1, A2> from_json_tuple_impl(BasicJsonType&& j, identity_tag<std::pair<A1, A2>> /*unused*/, priority_tag<0> /*unused*/)
+{
+    return {std::forward<BasicJsonType>(j).at(0).template get<A1>(),
+            std::forward<BasicJsonType>(j).at(1).template get<A2>()};
+}
+
+template<typename BasicJsonType, typename A1, typename A2>
+void from_json_tuple_impl(BasicJsonType&& j, std::pair<A1, A2>& p, priority_tag<1> /*unused*/)
+{
+    p = from_json_tuple_impl(std::forward<BasicJsonType>(j), identity_tag<std::pair<A1, A2>> {}, priority_tag<0> {});
+}
+
+template<typename BasicJsonType, typename... Args>
+std::tuple<Args...> from_json_tuple_impl(BasicJsonType&& j, identity_tag<std::tuple<Args...>> /*unused*/, priority_tag<2> /*unused*/)
+{
+    return from_json_tuple_impl_base<BasicJsonType, Args...>(std::forward<BasicJsonType>(j), index_sequence_for<Args...> {});
+}
+
+template<typename BasicJsonType, typename... Args>
+void from_json_tuple_impl(BasicJsonType&& j, std::tuple<Args...>& t, priority_tag<3> /*unused*/)
+{
+    t = from_json_tuple_impl_base<BasicJsonType, Args...>(std::forward<BasicJsonType>(j), index_sequence_for<Args...> {});
+}
+
+template<typename BasicJsonType, typename TupleRelated>
+auto from_json(BasicJsonType&& j, TupleRelated&& t)
+-> decltype(from_json_tuple_impl(std::forward<BasicJsonType>(j), std::forward<TupleRelated>(t), priority_tag<3> {}))
+{
+    if (JSON_HEDLEY_UNLIKELY(!j.is_array()))
+    {
+        JSON_THROW(type_error::create(302, "type must be array, but is " + std::string(j.type_name()), j));
+    }
+
+    return from_json_tuple_impl(std::forward<BasicJsonType>(j), std::forward<TupleRelated>(t), priority_tag<3> {});
+}
+
+template < typename BasicJsonType, typename Key, typename Value, typename Compare, typename Allocator,
+           typename = enable_if_t < !std::is_constructible <
+                                        typename BasicJsonType::string_t, Key >::value >>
+void from_json(const BasicJsonType& j, std::map<Key, Value, Compare, Allocator>& m)
+{
+    if (JSON_HEDLEY_UNLIKELY(!j.is_array()))
+    {
+        JSON_THROW(type_error::create(302, "type must be array, but is " + std::string(j.type_name()), j));
+    }
+    m.clear();
+    for (const auto& p : j)
+    {
+        if (JSON_HEDLEY_UNLIKELY(!p.is_array()))
+        {
+            JSON_THROW(type_error::create(302, "type must be array, but is " + std::string(p.type_name()), j));
+        }
+        m.emplace(p.at(0).template get<Key>(), p.at(1).template get<Value>());
+    }
+}
+
+template < typename BasicJsonType, typename Key, typename Value, typename Hash, typename KeyEqual, typename Allocator,
+           typename = enable_if_t < !std::is_constructible <
+                                        typename BasicJsonType::string_t, Key >::value >>
+void from_json(const BasicJsonType& j, std::unordered_map<Key, Value, Hash, KeyEqual, Allocator>& m)
+{
+    if (JSON_HEDLEY_UNLIKELY(!j.is_array()))
+    {
+        JSON_THROW(type_error::create(302, "type must be array, but is " + std::string(j.type_name()), j));
+    }
+    m.clear();
+    for (const auto& p : j)
+    {
+        if (JSON_HEDLEY_UNLIKELY(!p.is_array()))
+        {
+            JSON_THROW(type_error::create(302, "type must be array, but is " + std::string(p.type_name()), j));
+        }
+        m.emplace(p.at(0).template get<Key>(), p.at(1).template get<Value>());
+    }
+}
+
+#if JSON_HAS_FILESYSTEM || JSON_HAS_EXPERIMENTAL_FILESYSTEM
+template<typename BasicJsonType>
+void from_json(const BasicJsonType& j, std_fs::path& p)
+{
+    if (JSON_HEDLEY_UNLIKELY(!j.is_string()))
+    {
+        JSON_THROW(type_error::create(302, "type must be string, but is " + std::string(j.type_name()), j));
+    }
+    p = *j.template get_ptr<const typename BasicJsonType::string_t*>();
+}
+#endif
+
+struct from_json_fn
+{
+    template<typename BasicJsonType, typename T>
+    auto operator()(const BasicJsonType& j, T&& val) const
+    noexcept(noexcept(from_json(j, std::forward<T>(val))))
+    -> decltype(from_json(j, std::forward<T>(val)))
+    {
+        return from_json(j, std::forward<T>(val));
+    }
+};
+}  // namespace detail
+
+/// namespace to hold default `from_json` function
+/// to see why this is required:
+/// http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2015/n4381.html
+namespace // NOLINT(cert-dcl59-cpp,fuchsia-header-anon-namespaces,google-build-namespaces)
+{
+constexpr const auto& from_json = detail::static_const<detail::from_json_fn>::value; // NOLINT(misc-definitions-in-headers)
+} // namespace
+} // namespace nlohmann
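+
+// Example (illustrative, not part of the library): a user-defined type opts
+// into this machinery by providing a from_json overload in its own namespace,
+// which the from_json_fn function object above finds via ADL. A minimal
+// sketch, assuming a hypothetical type myns::point:
+//
+//   namespace myns {
+//   struct point { double x = 0.0; double y = 0.0; };
+//   inline void from_json(const nlohmann::json& j, point& p)
+//   {
+//       j.at("x").get_to(p.x);
+//       j.at("y").get_to(p.y);
+//   }
+//   } // namespace myns
+//
+//   // usage: auto p = j.get<myns::point>();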
+
+// #include <nlohmann/detail/conversions/to_json.hpp>
+
+
+#include <algorithm> // copy
+#include <iterator> // begin, end
+#include <string> // string
+#include <tuple> // tuple, get
+#include <type_traits> // is_same, is_constructible, is_floating_point, is_enum, underlying_type
+#include <utility> // move, forward, declval, pair
+#include <valarray> // valarray
+#include <vector> // vector
+
+// #include <nlohmann/detail/macro_scope.hpp>
+
+// #include <nlohmann/detail/iterators/iteration_proxy.hpp>
+
+
+#include <cstddef> // size_t
+#include <iterator> // input_iterator_tag
+#include <string> // string, to_string
+#include <tuple> // tuple_size, get, tuple_element
+#include <utility> // move
+
+// #include <nlohmann/detail/meta/type_traits.hpp>
+
+// #include <nlohmann/detail/value_t.hpp>
+
+
+namespace nlohmann
+{
+namespace detail
+{
+template<typename string_type>
+void int_to_string( string_type& target, std::size_t value )
+{
+    // For ADL
+    using std::to_string;
+    target = to_string(value);
+}
+template<typename IteratorType> class iteration_proxy_value
+{
+  public:
+    using difference_type = std::ptrdiff_t;
+    using value_type = iteration_proxy_value;
+    using pointer = value_type * ;
+    using reference = value_type & ;
+    using iterator_category = std::input_iterator_tag;
+    using string_type = typename std::remove_cv< typename std::remove_reference<decltype( std::declval<IteratorType>().key() ) >::type >::type;
+
+  private:
+    /// the iterator
+    IteratorType anchor;
+    /// an index for arrays (used to create key names)
+    std::size_t array_index = 0;
+    /// last stringified array index
+    mutable std::size_t array_index_last = 0;
+    /// a string representation of the array index
+    mutable string_type array_index_str = "0";
+    /// an empty string (to return a reference for primitive values)
+    const string_type empty_str{};
+
+  public:
+    explicit iteration_proxy_value(IteratorType it) noexcept
+        : anchor(std::move(it))
+    {}
+
+    /// dereference operator (needed for range-based for)
+    iteration_proxy_value& operator*()
+    {
+        return *this;
+    }
+
+    /// increment operator (needed for range-based for)
+    iteration_proxy_value& operator++()
+    {
+        ++anchor;
+        ++array_index;
+
+        return *this;
+    }
+
+    /// equality operator (needed for InputIterator)
+    bool operator==(const iteration_proxy_value& o) const
+    {
+        return anchor == o.anchor;
+    }
+
+    /// inequality operator (needed for range-based for)
+    bool operator!=(const iteration_proxy_value& o) const
+    {
+        return anchor != o.anchor;
+    }
+
+    /// return key of the iterator
+    const string_type& key() const
+    {
+        JSON_ASSERT(anchor.m_object != nullptr);
+
+        switch (anchor.m_object->type())
+        {
+            // use integer array index as key
+            case value_t::array:
+            {
+                if (array_index != array_index_last)
+                {
+                    int_to_string( array_index_str, array_index );
+                    array_index_last = array_index;
+                }
+                return array_index_str;
+            }
+
+            // use key from the object
+            case value_t::object:
+                return anchor.key();
+
+            // use an empty key for all primitive types
+            case value_t::null:
+            case value_t::string:
+            case value_t::boolean:
+            case value_t::number_integer:
+            case value_t::number_unsigned:
+            case value_t::number_float:
+            case value_t::binary:
+            case value_t::discarded:
+            default:
+                return empty_str;
+        }
+    }
+
+    /// return value of the iterator
+    typename IteratorType::reference value() const
+    {
+        return anchor.value();
+    }
+};
+
+/// proxy class for the items() function
+template<typename IteratorType> class iteration_proxy
+{
+  private:
+    /// the container to iterate
+    typename IteratorType::reference container;
+
+  public:
+    /// construct iteration proxy from a container
+    explicit iteration_proxy(typename IteratorType::reference cont) noexcept
+        : container(cont) {}
+
+    /// return iterator begin (needed for range-based for)
+    iteration_proxy_value<IteratorType> begin() noexcept
+    {
+        return iteration_proxy_value<IteratorType>(container.begin());
+    }
+
+    /// return iterator end (needed for range-based for)
+    iteration_proxy_value<IteratorType> end() noexcept
+    {
+        return iteration_proxy_value<IteratorType>(container.end());
+    }
+};
+// Structured Bindings Support
+// For further reference see https://blog.tartanllama.xyz/structured-bindings/
+// And see https://github.com/nlohmann/json/pull/1391
+template<std::size_t N, typename IteratorType, enable_if_t<N == 0, int> = 0>
+auto get(const nlohmann::detail::iteration_proxy_value<IteratorType>& i) -> decltype(i.key())
+{
+    return i.key();
+}
+// Structured Bindings Support
+// For further reference see https://blog.tartanllama.xyz/structured-bindings/
+// And see https://github.com/nlohmann/json/pull/1391
+template<std::size_t N, typename IteratorType, enable_if_t<N == 1, int> = 0>
+auto get(const nlohmann::detail::iteration_proxy_value<IteratorType>& i) -> decltype(i.value())
+{
+    return i.value();
+}
+}  // namespace detail
+}  // namespace nlohmann
+
+// The addition to the std namespace is required to add structured bindings
+// support to the iteration_proxy_value class.
+// For further reference see https://blog.tartanllama.xyz/structured-bindings/
+// And see https://github.com/nlohmann/json/pull/1391
+namespace std
+{
+#if defined(__clang__)
+    // Fix: https://github.com/nlohmann/json/issues/1401
+    #pragma clang diagnostic push
+    #pragma clang diagnostic ignored "-Wmismatched-tags"
+#endif
+template<typename IteratorType>
+class tuple_size<::nlohmann::detail::iteration_proxy_value<IteratorType>>
+            : public std::integral_constant<std::size_t, 2> {};
+
+template<std::size_t N, typename IteratorType>
+class tuple_element<N, ::nlohmann::detail::iteration_proxy_value<IteratorType >>
+{
+  public:
+    using type = decltype(
+                     get<N>(std::declval <
+                            ::nlohmann::detail::iteration_proxy_value<IteratorType >> ()));
+};
+#if defined(__clang__)
+    #pragma clang diagnostic pop
+#endif
+} // namespace std
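+
+// Example (illustrative): with the std::tuple_size/std::tuple_element
+// specializations above, the values produced by items() decompose via
+// C++17 structured bindings:
+//
+//   for (auto& [key, value] : j.items())
+//   {
+//       std::cout << key << " : " << value << '\n';
+//   }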
+
+// #include <nlohmann/detail/meta/cpp_future.hpp>
+
+// #include <nlohmann/detail/meta/type_traits.hpp>
+
+// #include <nlohmann/detail/value_t.hpp>
+
+
+#if JSON_HAS_EXPERIMENTAL_FILESYSTEM
+#include <experimental/filesystem>
+namespace nlohmann::detail
+{
+namespace std_fs = std::experimental::filesystem;
+} // namespace nlohmann::detail
+#elif JSON_HAS_FILESYSTEM
+#include <filesystem>
+namespace nlohmann::detail
+{
+namespace std_fs = std::filesystem;
+} // namespace nlohmann::detail
+#endif
+
+namespace nlohmann
+{
+namespace detail
+{
+//////////////////
+// constructors //
+//////////////////
+
+/*
+ * Note that all external_constructor<>::construct functions need to call
+ * j.m_value.destroy(j.m_type) to avoid a memory leak in case j contains an
+ * allocated value (e.g., a string). See issue
+ * https://github.com/nlohmann/json/issues/2865 for more information.
+ */
+
+template<value_t> struct external_constructor;
+
+template<>
+struct external_constructor<value_t::boolean>
+{
+    template<typename BasicJsonType>
+    static void construct(BasicJsonType& j, typename BasicJsonType::boolean_t b) noexcept
+    {
+        j.m_value.destroy(j.m_type);
+        j.m_type = value_t::boolean;
+        j.m_value = b;
+        j.assert_invariant();
+    }
+};
+
+template<>
+struct external_constructor<value_t::string>
+{
+    template<typename BasicJsonType>
+    static void construct(BasicJsonType& j, const typename BasicJsonType::string_t& s)
+    {
+        j.m_value.destroy(j.m_type);
+        j.m_type = value_t::string;
+        j.m_value = s;
+        j.assert_invariant();
+    }
+
+    template<typename BasicJsonType>
+    static void construct(BasicJsonType& j, typename BasicJsonType::string_t&& s)
+    {
+        j.m_value.destroy(j.m_type);
+        j.m_type = value_t::string;
+        j.m_value = std::move(s);
+        j.assert_invariant();
+    }
+
+    template < typename BasicJsonType, typename CompatibleStringType,
+               enable_if_t < !std::is_same<CompatibleStringType, typename BasicJsonType::string_t>::value,
+                             int > = 0 >
+    static void construct(BasicJsonType& j, const CompatibleStringType& str)
+    {
+        j.m_value.destroy(j.m_type);
+        j.m_type = value_t::string;
+        j.m_value.string = j.template create<typename BasicJsonType::string_t>(str);
+        j.assert_invariant();
+    }
+};
+
+template<>
+struct external_constructor<value_t::binary>
+{
+    template<typename BasicJsonType>
+    static void construct(BasicJsonType& j, const typename BasicJsonType::binary_t& b)
+    {
+        j.m_value.destroy(j.m_type);
+        j.m_type = value_t::binary;
+        j.m_value = typename BasicJsonType::binary_t(b);
+        j.assert_invariant();
+    }
+
+    template<typename BasicJsonType>
+    static void construct(BasicJsonType& j, typename BasicJsonType::binary_t&& b)
+    {
+        j.m_value.destroy(j.m_type);
+        j.m_type = value_t::binary;
+        j.m_value = typename BasicJsonType::binary_t(std::move(b));
+        j.assert_invariant();
+    }
+};
+
+template<>
+struct external_constructor<value_t::number_float>
+{
+    template<typename BasicJsonType>
+    static void construct(BasicJsonType& j, typename BasicJsonType::number_float_t val) noexcept
+    {
+        j.m_value.destroy(j.m_type);
+        j.m_type = value_t::number_float;
+        j.m_value = val;
+        j.assert_invariant();
+    }
+};
+
+template<>
+struct external_constructor<value_t::number_unsigned>
+{
+    template<typename BasicJsonType>
+    static void construct(BasicJsonType& j, typename BasicJsonType::number_unsigned_t val) noexcept
+    {
+        j.m_value.destroy(j.m_type);
+        j.m_type = value_t::number_unsigned;
+        j.m_value = val;
+        j.assert_invariant();
+    }
+};
+
+template<>
+struct external_constructor<value_t::number_integer>
+{
+    template<typename BasicJsonType>
+    static void construct(BasicJsonType& j, typename BasicJsonType::number_integer_t val) noexcept
+    {
+        j.m_value.destroy(j.m_type);
+        j.m_type = value_t::number_integer;
+        j.m_value = val;
+        j.assert_invariant();
+    }
+};
+
+template<>
+struct external_constructor<value_t::array>
+{
+    template<typename BasicJsonType>
+    static void construct(BasicJsonType& j, const typename BasicJsonType::array_t& arr)
+    {
+        j.m_value.destroy(j.m_type);
+        j.m_type = value_t::array;
+        j.m_value = arr;
+        j.set_parents();
+        j.assert_invariant();
+    }
+
+    template<typename BasicJsonType>
+    static void construct(BasicJsonType& j, typename BasicJsonType::array_t&& arr)
+    {
+        j.m_value.destroy(j.m_type);
+        j.m_type = value_t::array;
+        j.m_value = std::move(arr);
+        j.set_parents();
+        j.assert_invariant();
+    }
+
+    template < typename BasicJsonType, typename CompatibleArrayType,
+               enable_if_t < !std::is_same<CompatibleArrayType, typename BasicJsonType::array_t>::value,
+                             int > = 0 >
+    static void construct(BasicJsonType& j, const CompatibleArrayType& arr)
+    {
+        using std::begin;
+        using std::end;
+
+        j.m_value.destroy(j.m_type);
+        j.m_type = value_t::array;
+        j.m_value.array = j.template create<typename BasicJsonType::array_t>(begin(arr), end(arr));
+        j.set_parents();
+        j.assert_invariant();
+    }
+
+    template<typename BasicJsonType>
+    static void construct(BasicJsonType& j, const std::vector<bool>& arr)
+    {
+        j.m_value.destroy(j.m_type);
+        j.m_type = value_t::array;
+        j.m_value = value_t::array;
+        j.m_value.array->reserve(arr.size());
+        for (const bool x : arr)
+        {
+            j.m_value.array->push_back(x);
+            j.set_parent(j.m_value.array->back());
+        }
+        j.assert_invariant();
+    }
+
+    template<typename BasicJsonType, typename T,
+             enable_if_t<std::is_convertible<T, BasicJsonType>::value, int> = 0>
+    static void construct(BasicJsonType& j, const std::valarray<T>& arr)
+    {
+        j.m_value.destroy(j.m_type);
+        j.m_type = value_t::array;
+        j.m_value = value_t::array;
+        j.m_value.array->resize(arr.size());
+        if (arr.size() > 0)
+        {
+            std::copy(std::begin(arr), std::end(arr), j.m_value.array->begin());
+        }
+        j.set_parents();
+        j.assert_invariant();
+    }
+};
+
+template<>
+struct external_constructor<value_t::object>
+{
+    template<typename BasicJsonType>
+    static void construct(BasicJsonType& j, const typename BasicJsonType::object_t& obj)
+    {
+        j.m_value.destroy(j.m_type);
+        j.m_type = value_t::object;
+        j.m_value = obj;
+        j.set_parents();
+        j.assert_invariant();
+    }
+
+    template<typename BasicJsonType>
+    static void construct(BasicJsonType& j, typename BasicJsonType::object_t&& obj)
+    {
+        j.m_value.destroy(j.m_type);
+        j.m_type = value_t::object;
+        j.m_value = std::move(obj);
+        j.set_parents();
+        j.assert_invariant();
+    }
+
+    template < typename BasicJsonType, typename CompatibleObjectType,
+               enable_if_t < !std::is_same<CompatibleObjectType, typename BasicJsonType::object_t>::value, int > = 0 >
+    static void construct(BasicJsonType& j, const CompatibleObjectType& obj)
+    {
+        using std::begin;
+        using std::end;
+
+        j.m_value.destroy(j.m_type);
+        j.m_type = value_t::object;
+        j.m_value.object = j.template create<typename BasicJsonType::object_t>(begin(obj), end(obj));
+        j.set_parents();
+        j.assert_invariant();
+    }
+};
+
+/////////////
+// to_json //
+/////////////
+
+template<typename BasicJsonType, typename T,
+         enable_if_t<std::is_same<T, typename BasicJsonType::boolean_t>::value, int> = 0>
+void to_json(BasicJsonType& j, T b) noexcept
+{
+    external_constructor<value_t::boolean>::construct(j, b);
+}
+
+template<typename BasicJsonType, typename CompatibleString,
+         enable_if_t<std::is_constructible<typename BasicJsonType::string_t, CompatibleString>::value, int> = 0>
+void to_json(BasicJsonType& j, const CompatibleString& s)
+{
+    external_constructor<value_t::string>::construct(j, s);
+}
+
+template<typename BasicJsonType>
+void to_json(BasicJsonType& j, typename BasicJsonType::string_t&& s)
+{
+    external_constructor<value_t::string>::construct(j, std::move(s));
+}
+
+template<typename BasicJsonType, typename FloatType,
+         enable_if_t<std::is_floating_point<FloatType>::value, int> = 0>
+void to_json(BasicJsonType& j, FloatType val) noexcept
+{
+    external_constructor<value_t::number_float>::construct(j, static_cast<typename BasicJsonType::number_float_t>(val));
+}
+
+template<typename BasicJsonType, typename CompatibleNumberUnsignedType,
+         enable_if_t<is_compatible_integer_type<typename BasicJsonType::number_unsigned_t, CompatibleNumberUnsignedType>::value, int> = 0>
+void to_json(BasicJsonType& j, CompatibleNumberUnsignedType val) noexcept
+{
+    external_constructor<value_t::number_unsigned>::construct(j, static_cast<typename BasicJsonType::number_unsigned_t>(val));
+}
+
+template<typename BasicJsonType, typename CompatibleNumberIntegerType,
+         enable_if_t<is_compatible_integer_type<typename BasicJsonType::number_integer_t, CompatibleNumberIntegerType>::value, int> = 0>
+void to_json(BasicJsonType& j, CompatibleNumberIntegerType val) noexcept
+{
+    external_constructor<value_t::number_integer>::construct(j, static_cast<typename BasicJsonType::number_integer_t>(val));
+}
+
+template<typename BasicJsonType, typename EnumType,
+         enable_if_t<std::is_enum<EnumType>::value, int> = 0>
+void to_json(BasicJsonType& j, EnumType e) noexcept
+{
+    using underlying_type = typename std::underlying_type<EnumType>::type;
+    external_constructor<value_t::number_integer>::construct(j, static_cast<underlying_type>(e));
+}
+
+template<typename BasicJsonType>
+void to_json(BasicJsonType& j, const std::vector<bool>& e)
+{
+    external_constructor<value_t::array>::construct(j, e);
+}
+
+template < typename BasicJsonType, typename CompatibleArrayType,
+           enable_if_t < is_compatible_array_type<BasicJsonType,
+                         CompatibleArrayType>::value&&
+                         !is_compatible_object_type<BasicJsonType, CompatibleArrayType>::value&&
+                         !is_compatible_string_type<BasicJsonType, CompatibleArrayType>::value&&
+                         !std::is_same<typename BasicJsonType::binary_t, CompatibleArrayType>::value&&
+                         !is_basic_json<CompatibleArrayType>::value,
+                         int > = 0 >
+void to_json(BasicJsonType& j, const CompatibleArrayType& arr)
+{
+    external_constructor<value_t::array>::construct(j, arr);
+}
+
+template<typename BasicJsonType>
+void to_json(BasicJsonType& j, const typename BasicJsonType::binary_t& bin)
+{
+    external_constructor<value_t::binary>::construct(j, bin);
+}
+
+template<typename BasicJsonType, typename T,
+         enable_if_t<std::is_convertible<T, BasicJsonType>::value, int> = 0>
+void to_json(BasicJsonType& j, const std::valarray<T>& arr)
+{
+    external_constructor<value_t::array>::construct(j, std::move(arr));
+}
+
+template<typename BasicJsonType>
+void to_json(BasicJsonType& j, typename BasicJsonType::array_t&& arr)
+{
+    external_constructor<value_t::array>::construct(j, std::move(arr));
+}
+
+template < typename BasicJsonType, typename CompatibleObjectType,
+           enable_if_t < is_compatible_object_type<BasicJsonType, CompatibleObjectType>::value&& !is_basic_json<CompatibleObjectType>::value, int > = 0 >
+void to_json(BasicJsonType& j, const CompatibleObjectType& obj)
+{
+    external_constructor<value_t::object>::construct(j, obj);
+}
+
+template<typename BasicJsonType>
+void to_json(BasicJsonType& j, typename BasicJsonType::object_t&& obj)
+{
+    external_constructor<value_t::object>::construct(j, std::move(obj));
+}
+
+template <
+    typename BasicJsonType, typename T, std::size_t N,
+    enable_if_t < !std::is_constructible<typename BasicJsonType::string_t,
+                  const T(&)[N]>::value, // NOLINT(cppcoreguidelines-avoid-c-arrays,hicpp-avoid-c-arrays,modernize-avoid-c-arrays)
+                  int > = 0 >
+void to_json(BasicJsonType& j, const T(&arr)[N]) // NOLINT(cppcoreguidelines-avoid-c-arrays,hicpp-avoid-c-arrays,modernize-avoid-c-arrays)
+{
+    external_constructor<value_t::array>::construct(j, arr);
+}
+
+template < typename BasicJsonType, typename T1, typename T2, enable_if_t < std::is_constructible<BasicJsonType, T1>::value&& std::is_constructible<BasicJsonType, T2>::value, int > = 0 >
+void to_json(BasicJsonType& j, const std::pair<T1, T2>& p)
+{
+    j = { p.first, p.second };
+}
+
+// for https://github.com/nlohmann/json/pull/1134
+template<typename BasicJsonType, typename T,
+         enable_if_t<std::is_same<T, iteration_proxy_value<typename BasicJsonType::iterator>>::value, int> = 0>
+void to_json(BasicJsonType& j, const T& b)
+{
+    j = { {b.key(), b.value()} };
+}
+
+template<typename BasicJsonType, typename Tuple, std::size_t... Idx>
+void to_json_tuple_impl(BasicJsonType& j, const Tuple& t, index_sequence<Idx...> /*unused*/)
+{
+    j = { std::get<Idx>(t)... };
+}
+
+template<typename BasicJsonType, typename T, enable_if_t<is_constructible_tuple<BasicJsonType, T>::value, int > = 0>
+void to_json(BasicJsonType& j, const T& t)
+{
+    to_json_tuple_impl(j, t, make_index_sequence<std::tuple_size<T>::value> {});
+}
+
+#if JSON_HAS_FILESYSTEM || JSON_HAS_EXPERIMENTAL_FILESYSTEM
+template<typename BasicJsonType>
+void to_json(BasicJsonType& j, const std_fs::path& p)
+{
+    j = p.string();
+}
+#endif
+
+struct to_json_fn
+{
+    template<typename BasicJsonType, typename T>
+    auto operator()(BasicJsonType& j, T&& val) const noexcept(noexcept(to_json(j, std::forward<T>(val))))
+    -> decltype(to_json(j, std::forward<T>(val)), void())
+    {
+        return to_json(j, std::forward<T>(val));
+    }
+};
+}  // namespace detail
+
+/// namespace to hold default `to_json` function
+/// to see why this is required:
+/// http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2015/n4381.html
+namespace // NOLINT(cert-dcl59-cpp,fuchsia-header-anon-namespaces,google-build-namespaces)
+{
+constexpr const auto& to_json = detail::static_const<detail::to_json_fn>::value; // NOLINT(misc-definitions-in-headers)
+} // namespace
+} // namespace nlohmann
+
+// #include <nlohmann/detail/meta/identity_tag.hpp>
+
+// #include <nlohmann/detail/meta/type_traits.hpp>
+
+
+namespace nlohmann
+{
+
+/// @sa https://json.nlohmann.me/api/adl_serializer/
+template<typename ValueType, typename>
+struct adl_serializer
+{
+    /// @brief convert a JSON value to any value type
+    /// @sa https://json.nlohmann.me/api/adl_serializer/from_json/
+    template<typename BasicJsonType, typename TargetType = ValueType>
+    static auto from_json(BasicJsonType && j, TargetType& val) noexcept(
+        noexcept(::nlohmann::from_json(std::forward<BasicJsonType>(j), val)))
+    -> decltype(::nlohmann::from_json(std::forward<BasicJsonType>(j), val), void())
+    {
+        ::nlohmann::from_json(std::forward<BasicJsonType>(j), val);
+    }
+
+    /// @brief convert a JSON value to any value type
+    /// @sa https://json.nlohmann.me/api/adl_serializer/from_json/
+    template<typename BasicJsonType, typename TargetType = ValueType>
+    static auto from_json(BasicJsonType && j) noexcept(
+    noexcept(::nlohmann::from_json(std::forward<BasicJsonType>(j), detail::identity_tag<TargetType> {})))
+    -> decltype(::nlohmann::from_json(std::forward<BasicJsonType>(j), detail::identity_tag<TargetType> {}))
+    {
+        return ::nlohmann::from_json(std::forward<BasicJsonType>(j), detail::identity_tag<TargetType> {});
+    }
+
+    /// @brief convert any value type to a JSON value
+    /// @sa https://json.nlohmann.me/api/adl_serializer/to_json/
+    template<typename BasicJsonType, typename TargetType = ValueType>
+    static auto to_json(BasicJsonType& j, TargetType && val) noexcept(
+        noexcept(::nlohmann::to_json(j, std::forward<TargetType>(val))))
+    -> decltype(::nlohmann::to_json(j, std::forward<TargetType>(val)), void())
+    {
+        ::nlohmann::to_json(j, std::forward<TargetType>(val));
+    }
+};
+}  // namespace nlohmann
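+
+// Example (illustrative): specializing adl_serializer is the customization
+// point for third-party types that cannot provide their own to_json/from_json
+// overloads. A minimal sketch for std::chrono::milliseconds:
+//
+//   namespace nlohmann {
+//   template<>
+//   struct adl_serializer<std::chrono::milliseconds>
+//   {
+//       static void to_json(json& j, const std::chrono::milliseconds& ms)
+//       {
+//           j = ms.count();
+//       }
+//       static void from_json(const json& j, std::chrono::milliseconds& ms)
+//       {
+//           ms = std::chrono::milliseconds(j.get<std::int64_t>());
+//       }
+//   };
+//   } // namespace nlohmann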
+
+// #include <nlohmann/byte_container_with_subtype.hpp>
+
+
+#include <cstdint> // uint8_t, uint64_t
+#include <tuple> // tie
+#include <utility> // move
+
+namespace nlohmann
+{
+
+/// @brief an internal type for a binary container with an optional subtype
+/// @sa https://json.nlohmann.me/api/byte_container_with_subtype/
+template<typename BinaryType>
+class byte_container_with_subtype : public BinaryType
+{
+  public:
+    using container_type = BinaryType;
+    using subtype_type = std::uint64_t;
+
+    /// @sa https://json.nlohmann.me/api/byte_container_with_subtype/byte_container_with_subtype/
+    byte_container_with_subtype() noexcept(noexcept(container_type()))
+        : container_type()
+    {}
+
+    /// @sa https://json.nlohmann.me/api/byte_container_with_subtype/byte_container_with_subtype/
+    byte_container_with_subtype(const container_type& b) noexcept(noexcept(container_type(b)))
+        : container_type(b)
+    {}
+
+    /// @sa https://json.nlohmann.me/api/byte_container_with_subtype/byte_container_with_subtype/
+    byte_container_with_subtype(container_type&& b) noexcept(noexcept(container_type(std::move(b))))
+        : container_type(std::move(b))
+    {}
+
+    /// @sa https://json.nlohmann.me/api/byte_container_with_subtype/byte_container_with_subtype/
+    byte_container_with_subtype(const container_type& b, subtype_type subtype_) noexcept(noexcept(container_type(b)))
+        : container_type(b)
+        , m_subtype(subtype_)
+        , m_has_subtype(true)
+    {}
+
+    /// @sa https://json.nlohmann.me/api/byte_container_with_subtype/byte_container_with_subtype/
+    byte_container_with_subtype(container_type&& b, subtype_type subtype_) noexcept(noexcept(container_type(std::move(b))))
+        : container_type(std::move(b))
+        , m_subtype(subtype_)
+        , m_has_subtype(true)
+    {}
+
+    bool operator==(const byte_container_with_subtype& rhs) const
+    {
+        return std::tie(static_cast<const BinaryType&>(*this), m_subtype, m_has_subtype) ==
+               std::tie(static_cast<const BinaryType&>(rhs), rhs.m_subtype, rhs.m_has_subtype);
+    }
+
+    bool operator!=(const byte_container_with_subtype& rhs) const
+    {
+        return !(rhs == *this);
+    }
+
+    /// @brief sets the binary subtype
+    /// @sa https://json.nlohmann.me/api/byte_container_with_subtype/set_subtype/
+    void set_subtype(subtype_type subtype_) noexcept
+    {
+        m_subtype = subtype_;
+        m_has_subtype = true;
+    }
+
+    /// @brief return the binary subtype
+    /// @sa https://json.nlohmann.me/api/byte_container_with_subtype/subtype/
+    constexpr subtype_type subtype() const noexcept
+    {
+        return m_has_subtype ? m_subtype : static_cast<subtype_type>(-1);
+    }
+
+    /// @brief return whether the value has a subtype
+    /// @sa https://json.nlohmann.me/api/byte_container_with_subtype/has_subtype/
+    constexpr bool has_subtype() const noexcept
+    {
+        return m_has_subtype;
+    }
+
+    /// @brief clears the binary subtype
+    /// @sa https://json.nlohmann.me/api/byte_container_with_subtype/clear_subtype/
+    void clear_subtype() noexcept
+    {
+        m_subtype = 0;
+        m_has_subtype = false;
+    }
+
+  private:
+    subtype_type m_subtype = 0;
+    bool m_has_subtype = false;
+};
+
+}  // namespace nlohmann
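+
+// Example (illustrative): binary values carry an optional numeric subtype,
+// used e.g. when round-tripping CBOR tagged byte strings or BSON binaries:
+//
+//   auto bin = nlohmann::json::binary({0xCA, 0xFE}, 42);
+//   // bin.get_binary().has_subtype() == true
+//   // bin.get_binary().subtype()     == 42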
+
+// #include <nlohmann/detail/conversions/from_json.hpp>
+
+// #include <nlohmann/detail/conversions/to_json.hpp>
+
+// #include <nlohmann/detail/exceptions.hpp>
+
+// #include <nlohmann/detail/hash.hpp>
+
+
+#include <cstdint> // uint8_t
+#include <cstddef> // size_t
+#include <functional> // hash
+
+// #include <nlohmann/detail/macro_scope.hpp>
+
+// #include <nlohmann/detail/value_t.hpp>
+
+
+namespace nlohmann
+{
+namespace detail
+{
+
+// boost::hash_combine
+inline std::size_t combine(std::size_t seed, std::size_t h) noexcept
+{
+    seed ^= h + 0x9e3779b9 + (seed << 6U) + (seed >> 2U);
+    return seed;
+}
+
+/*!
+@brief hash a JSON value
+
+The hash function tries to rely on std::hash where possible. Furthermore, the
+type of the JSON value is taken into account to have different hash values for
+null, 0, 0U, and false, etc.
+
+@tparam BasicJsonType basic_json specialization
+@param j JSON value to hash
+@return hash value of j
+*/
+template<typename BasicJsonType>
+std::size_t hash(const BasicJsonType& j)
+{
+    using string_t = typename BasicJsonType::string_t;
+    using number_integer_t = typename BasicJsonType::number_integer_t;
+    using number_unsigned_t = typename BasicJsonType::number_unsigned_t;
+    using number_float_t = typename BasicJsonType::number_float_t;
+
+    const auto type = static_cast<std::size_t>(j.type());
+    switch (j.type())
+    {
+        case BasicJsonType::value_t::null:
+        case BasicJsonType::value_t::discarded:
+        {
+            return combine(type, 0);
+        }
+
+        case BasicJsonType::value_t::object:
+        {
+            auto seed = combine(type, j.size());
+            for (const auto& element : j.items())
+            {
+                const auto h = std::hash<string_t> {}(element.key());
+                seed = combine(seed, h);
+                seed = combine(seed, hash(element.value()));
+            }
+            return seed;
+        }
+
+        case BasicJsonType::value_t::array:
+        {
+            auto seed = combine(type, j.size());
+            for (const auto& element : j)
+            {
+                seed = combine(seed, hash(element));
+            }
+            return seed;
+        }
+
+        case BasicJsonType::value_t::string:
+        {
+            const auto h = std::hash<string_t> {}(j.template get_ref<const string_t&>());
+            return combine(type, h);
+        }
+
+        case BasicJsonType::value_t::boolean:
+        {
+            const auto h = std::hash<bool> {}(j.template get<bool>());
+            return combine(type, h);
+        }
+
+        case BasicJsonType::value_t::number_integer:
+        {
+            const auto h = std::hash<number_integer_t> {}(j.template get<number_integer_t>());
+            return combine(type, h);
+        }
+
+        case BasicJsonType::value_t::number_unsigned:
+        {
+            const auto h = std::hash<number_unsigned_t> {}(j.template get<number_unsigned_t>());
+            return combine(type, h);
+        }
+
+        case BasicJsonType::value_t::number_float:
+        {
+            const auto h = std::hash<number_float_t> {}(j.template get<number_float_t>());
+            return combine(type, h);
+        }
+
+        case BasicJsonType::value_t::binary:
+        {
+            auto seed = combine(type, j.get_binary().size());
+            const auto h = std::hash<bool> {}(j.get_binary().has_subtype());
+            seed = combine(seed, h);
+            seed = combine(seed, static_cast<std::size_t>(j.get_binary().subtype()));
+            for (const auto byte : j.get_binary())
+            {
+                seed = combine(seed, std::hash<std::uint8_t> {}(byte));
+            }
+            return seed;
+        }
+
+        default:                   // LCOV_EXCL_LINE
+            JSON_ASSERT(false); // NOLINT(cert-dcl03-c,hicpp-static-assert,misc-static-assert) LCOV_EXCL_LINE
+            return 0;              // LCOV_EXCL_LINE
+    }
+}
+
+}  // namespace detail
+}  // namespace nlohmann
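+
+// Example (illustrative): this function backs the std::hash specialization
+// for basic_json defined near the end of this header, so JSON values can be
+// used as keys of unordered containers:
+//
+//   std::unordered_set<nlohmann::json> seen;
+//   seen.insert(nlohmann::json{{"id", 1}});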
+
+// #include <nlohmann/detail/input/binary_reader.hpp>
+
+
+#include <algorithm> // generate_n
+#include <array> // array
+#include <cmath> // ldexp
+#include <cstddef> // size_t
+#include <cstdint> // uint8_t, uint16_t, uint32_t, uint64_t
+#include <cstdio> // snprintf
+#include <cstring> // memcpy
+#include <iterator> // back_inserter
+#include <limits> // numeric_limits
+#include <string> // char_traits, string
+#include <utility> // make_pair, move
+#include <vector> // vector
+
+// #include <nlohmann/detail/exceptions.hpp>
+
+// #include <nlohmann/detail/input/input_adapters.hpp>
+
+
+#include <array> // array
+#include <cstddef> // size_t
+#include <cstring> // strlen
+#include <iterator> // begin, end, iterator_traits, random_access_iterator_tag, distance, next
+#include <memory> // shared_ptr, make_shared, addressof
+#include <numeric> // accumulate
+#include <string> // string, char_traits
+#include <type_traits> // enable_if, is_base_of, is_pointer, is_integral, remove_pointer
+#include <utility> // pair, declval
+
+#ifndef JSON_NO_IO
+    #include <cstdio>   // FILE *
+    #include <istream>  // istream
+#endif                  // JSON_NO_IO
+
+// #include <nlohmann/detail/iterators/iterator_traits.hpp>
+
+// #include <nlohmann/detail/macro_scope.hpp>
+
+
+namespace nlohmann
+{
+namespace detail
+{
+/// the supported input formats
+enum class input_format_t { json, cbor, msgpack, ubjson, bson };
+
+////////////////////
+// input adapters //
+////////////////////
+
+#ifndef JSON_NO_IO
+/*!
+Input adapter for stdio file access. This adapter reads only one byte at a time
+and does not use any buffer. It is a very low-level adapter.
+*/
+class file_input_adapter
+{
+  public:
+    using char_type = char;
+
+    JSON_HEDLEY_NON_NULL(2)
+    explicit file_input_adapter(std::FILE* f) noexcept
+        : m_file(f)
+    {}
+
+    // make class move-only
+    file_input_adapter(const file_input_adapter&) = delete;
+    file_input_adapter(file_input_adapter&&) noexcept = default;
+    file_input_adapter& operator=(const file_input_adapter&) = delete;
+    file_input_adapter& operator=(file_input_adapter&&) = delete;
+    ~file_input_adapter() = default;
+
+    std::char_traits<char>::int_type get_character() noexcept
+    {
+        return std::fgetc(m_file);
+    }
+
+  private:
+    /// the file pointer to read from
+    std::FILE* m_file;
+};
+
+
+/*!
+Input adapter for a (caching) istream. Ignores a UTF-8 Byte Order Mark at the
+beginning of input. Does not support changing the underlying std::streambuf
+in mid-input. Maintains underlying std::istream and std::streambuf to support
+subsequent use of standard std::istream operations to process any input
+characters following those used in parsing the JSON input.  Clears the
+std::istream flags; any input errors (e.g., EOF) will be detected by the first
+subsequent call for input from the std::istream.
+*/
+class input_stream_adapter
+{
+  public:
+    using char_type = char;
+
+    ~input_stream_adapter()
+    {
+        // clear stream flags; we use the underlying streambuf I/O and do not
+        // maintain ifstream flags, except eof
+        if (is != nullptr)
+        {
+            is->clear(is->rdstate() & std::ios::eofbit);
+        }
+    }
+
+    explicit input_stream_adapter(std::istream& i)
+        : is(&i), sb(i.rdbuf())
+    {}
+
+    // delete because of pointer members
+    input_stream_adapter(const input_stream_adapter&) = delete;
+    input_stream_adapter& operator=(input_stream_adapter&) = delete;
+    input_stream_adapter& operator=(input_stream_adapter&&) = delete;
+
+    input_stream_adapter(input_stream_adapter&& rhs) noexcept
+        : is(rhs.is), sb(rhs.sb)
+    {
+        rhs.is = nullptr;
+        rhs.sb = nullptr;
+    }
+
+    // std::istream/std::streambuf use std::char_traits<char>::to_int_type, to
+    // ensure that std::char_traits<char>::eof() and the character 0xFF do not
+    // end up as the same value, e.g. 0xFFFFFFFF.
+    std::char_traits<char>::int_type get_character()
+    {
+        auto res = sb->sbumpc();
+        // set eof manually, as we don't use the istream interface.
+        if (JSON_HEDLEY_UNLIKELY(res == std::char_traits<char>::eof()))
+        {
+            is->clear(is->rdstate() | std::ios::eofbit);
+        }
+        return res;
+    }
+
+  private:
+    /// the associated input stream
+    std::istream* is = nullptr;
+    std::streambuf* sb = nullptr;
+};
+#endif  // JSON_NO_IO
+
+// General-purpose iterator-based adapter. It might not be as fast as
+// theoretically possible for some containers, but it is extremely versatile.
+template<typename IteratorType>
+class iterator_input_adapter
+{
+  public:
+    using char_type = typename std::iterator_traits<IteratorType>::value_type;
+
+    iterator_input_adapter(IteratorType first, IteratorType last)
+        : current(std::move(first)), end(std::move(last))
+    {}
+
+    typename std::char_traits<char_type>::int_type get_character()
+    {
+        if (JSON_HEDLEY_LIKELY(current != end))
+        {
+            auto result = std::char_traits<char_type>::to_int_type(*current);
+            std::advance(current, 1);
+            return result;
+        }
+
+        return std::char_traits<char_type>::eof();
+    }
+
+  private:
+    IteratorType current;
+    IteratorType end;
+
+    template<typename BaseInputAdapter, size_t T>
+    friend struct wide_string_input_helper;
+
+    bool empty() const
+    {
+        return current == end;
+    }
+};
+
+
+template<typename BaseInputAdapter, size_t T>
+struct wide_string_input_helper;
+
+template<typename BaseInputAdapter>
+struct wide_string_input_helper<BaseInputAdapter, 4>
+{
+    // UTF-32
+    static void fill_buffer(BaseInputAdapter& input,
+                            std::array<std::char_traits<char>::int_type, 4>& utf8_bytes,
+                            size_t& utf8_bytes_index,
+                            size_t& utf8_bytes_filled)
+    {
+        utf8_bytes_index = 0;
+
+        if (JSON_HEDLEY_UNLIKELY(input.empty()))
+        {
+            utf8_bytes[0] = std::char_traits<char>::eof();
+            utf8_bytes_filled = 1;
+        }
+        else
+        {
+            // get the current character
+            const auto wc = input.get_character();
+
+            // UTF-32 to UTF-8 encoding
+            if (wc < 0x80)
+            {
+                utf8_bytes[0] = static_cast<std::char_traits<char>::int_type>(wc);
+                utf8_bytes_filled = 1;
+            }
+            else if (wc <= 0x7FF)
+            {
+                utf8_bytes[0] = static_cast<std::char_traits<char>::int_type>(0xC0u | ((static_cast<unsigned int>(wc) >> 6u) & 0x1Fu));
+                utf8_bytes[1] = static_cast<std::char_traits<char>::int_type>(0x80u | (static_cast<unsigned int>(wc) & 0x3Fu));
+                utf8_bytes_filled = 2;
+            }
+            else if (wc <= 0xFFFF)
+            {
+                utf8_bytes[0] = static_cast<std::char_traits<char>::int_type>(0xE0u | ((static_cast<unsigned int>(wc) >> 12u) & 0x0Fu));
+                utf8_bytes[1] = static_cast<std::char_traits<char>::int_type>(0x80u | ((static_cast<unsigned int>(wc) >> 6u) & 0x3Fu));
+                utf8_bytes[2] = static_cast<std::char_traits<char>::int_type>(0x80u | (static_cast<unsigned int>(wc) & 0x3Fu));
+                utf8_bytes_filled = 3;
+            }
+            else if (wc <= 0x10FFFF)
+            {
+                utf8_bytes[0] = static_cast<std::char_traits<char>::int_type>(0xF0u | ((static_cast<unsigned int>(wc) >> 18u) & 0x07u));
+                utf8_bytes[1] = static_cast<std::char_traits<char>::int_type>(0x80u | ((static_cast<unsigned int>(wc) >> 12u) & 0x3Fu));
+                utf8_bytes[2] = static_cast<std::char_traits<char>::int_type>(0x80u | ((static_cast<unsigned int>(wc) >> 6u) & 0x3Fu));
+                utf8_bytes[3] = static_cast<std::char_traits<char>::int_type>(0x80u | (static_cast<unsigned int>(wc) & 0x3Fu));
+                utf8_bytes_filled = 4;
+            }
+            else
+            {
+                // unknown character
+                utf8_bytes[0] = static_cast<std::char_traits<char>::int_type>(wc);
+                utf8_bytes_filled = 1;
+            }
+        }
+    }
+};
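+
+// Worked example (illustrative): for wc = U+20AC (EURO SIGN), the
+// wc <= 0xFFFF branch above yields the three UTF-8 bytes
+//   0xE0 | ((0x20AC >> 12) & 0x0F) = 0xE2
+//   0x80 | ((0x20AC >>  6) & 0x3F) = 0x82
+//   0x80 | ( 0x20AC        & 0x3F) = 0xAC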
+
+template<typename BaseInputAdapter>
+struct wide_string_input_helper<BaseInputAdapter, 2>
+{
+    // UTF-16
+    static void fill_buffer(BaseInputAdapter& input,
+                            std::array<std::char_traits<char>::int_type, 4>& utf8_bytes,
+                            size_t& utf8_bytes_index,
+                            size_t& utf8_bytes_filled)
+    {
+        utf8_bytes_index = 0;
+
+        if (JSON_HEDLEY_UNLIKELY(input.empty()))
+        {
+            utf8_bytes[0] = std::char_traits<char>::eof();
+            utf8_bytes_filled = 1;
+        }
+        else
+        {
+            // get the current character
+            const auto wc = input.get_character();
+
+            // UTF-16 to UTF-8 encoding
+            if (wc < 0x80)
+            {
+                utf8_bytes[0] = static_cast<std::char_traits<char>::int_type>(wc);
+                utf8_bytes_filled = 1;
+            }
+            else if (wc <= 0x7FF)
+            {
+                utf8_bytes[0] = static_cast<std::char_traits<char>::int_type>(0xC0u | ((static_cast<unsigned int>(wc) >> 6u)));
+                utf8_bytes[1] = static_cast<std::char_traits<char>::int_type>(0x80u | (static_cast<unsigned int>(wc) & 0x3Fu));
+                utf8_bytes_filled = 2;
+            }
+            else if (0xD800 > wc || wc >= 0xE000)
+            {
+                utf8_bytes[0] = static_cast<std::char_traits<char>::int_type>(0xE0u | ((static_cast<unsigned int>(wc) >> 12u)));
+                utf8_bytes[1] = static_cast<std::char_traits<char>::int_type>(0x80u | ((static_cast<unsigned int>(wc) >> 6u) & 0x3Fu));
+                utf8_bytes[2] = static_cast<std::char_traits<char>::int_type>(0x80u | (static_cast<unsigned int>(wc) & 0x3Fu));
+                utf8_bytes_filled = 3;
+            }
+            else
+            {
+                if (JSON_HEDLEY_UNLIKELY(!input.empty()))
+                {
+                    const auto wc2 = static_cast<unsigned int>(input.get_character());
+                    const auto charcode = 0x10000u + (((static_cast<unsigned int>(wc) & 0x3FFu) << 10u) | (wc2 & 0x3FFu));
+                    utf8_bytes[0] = static_cast<std::char_traits<char>::int_type>(0xF0u | (charcode >> 18u));
+                    utf8_bytes[1] = static_cast<std::char_traits<char>::int_type>(0x80u | ((charcode >> 12u) & 0x3Fu));
+                    utf8_bytes[2] = static_cast<std::char_traits<char>::int_type>(0x80u | ((charcode >> 6u) & 0x3Fu));
+                    utf8_bytes[3] = static_cast<std::char_traits<char>::int_type>(0x80u | (charcode & 0x3Fu));
+                    utf8_bytes_filled = 4;
+                }
+                else
+                {
+                    utf8_bytes[0] = static_cast<std::char_traits<char>::int_type>(wc);
+                    utf8_bytes_filled = 1;
+                }
+            }
+        }
+    }
+};
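+
+// Worked example (illustrative): U+1F600 arrives as the UTF-16 surrogate pair
+// wc = 0xD83D, wc2 = 0xDE00. The surrogate branch above reconstructs
+//   charcode = 0x10000 + (((0xD83D & 0x3FF) << 10) | (0xDE00 & 0x3FF)) = 0x1F600
+// and emits the four UTF-8 bytes 0xF0 0x9F 0x98 0x80.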
+
+// Wraps another input adapter to convert wide character types into individual bytes.
+template<typename BaseInputAdapter, typename WideCharType>
+class wide_string_input_adapter
+{
+  public:
+    using char_type = char;
+
+    wide_string_input_adapter(BaseInputAdapter base)
+        : base_adapter(base) {}
+
+    typename std::char_traits<char>::int_type get_character() noexcept
+    {
+        // check if buffer needs to be filled
+        if (utf8_bytes_index == utf8_bytes_filled)
+        {
+            fill_buffer<sizeof(WideCharType)>();
+
+            JSON_ASSERT(utf8_bytes_filled > 0);
+            JSON_ASSERT(utf8_bytes_index == 0);
+        }
+
+        // use buffer
+        JSON_ASSERT(utf8_bytes_filled > 0);
+        JSON_ASSERT(utf8_bytes_index < utf8_bytes_filled);
+        return utf8_bytes[utf8_bytes_index++];
+    }
+
+  private:
+    BaseInputAdapter base_adapter;
+
+    template<size_t T>
+    void fill_buffer()
+    {
+        wide_string_input_helper<BaseInputAdapter, T>::fill_buffer(base_adapter, utf8_bytes, utf8_bytes_index, utf8_bytes_filled);
+    }
+
+    /// a buffer for UTF-8 bytes
+    std::array<std::char_traits<char>::int_type, 4> utf8_bytes = {{0, 0, 0, 0}};
+
+    /// index into the utf8_bytes array for the next valid byte
+    std::size_t utf8_bytes_index = 0;
+    /// number of valid bytes in the utf8_bytes array
+    std::size_t utf8_bytes_filled = 0;
+};
+
+
+template<typename IteratorType, typename Enable = void>
+struct iterator_input_adapter_factory
+{
+    using iterator_type = IteratorType;
+    using char_type = typename std::iterator_traits<iterator_type>::value_type;
+    using adapter_type = iterator_input_adapter<iterator_type>;
+
+    static adapter_type create(IteratorType first, IteratorType last)
+    {
+        return adapter_type(std::move(first), std::move(last));
+    }
+};
+
+template<typename T>
+struct is_iterator_of_multibyte
+{
+    using value_type = typename std::iterator_traits<T>::value_type;
+    enum
+    {
+        value = sizeof(value_type) > 1
+    };
+};
+
+template<typename IteratorType>
+struct iterator_input_adapter_factory<IteratorType, enable_if_t<is_iterator_of_multibyte<IteratorType>::value>>
+{
+    using iterator_type = IteratorType;
+    using char_type = typename std::iterator_traits<iterator_type>::value_type;
+    using base_adapter_type = iterator_input_adapter<iterator_type>;
+    using adapter_type = wide_string_input_adapter<base_adapter_type, char_type>;
+
+    static adapter_type create(IteratorType first, IteratorType last)
+    {
+        return adapter_type(base_adapter_type(std::move(first), std::move(last)));
+    }
+};
+
+// General purpose iterator-based input
+template<typename IteratorType>
+typename iterator_input_adapter_factory<IteratorType>::adapter_type input_adapter(IteratorType first, IteratorType last)
+{
+    using factory_type = iterator_input_adapter_factory<IteratorType>;
+    return factory_type::create(first, last);
+}
+
+// Convenience shorthand from container to iterator
+// Enables ADL on begin(container) and end(container)
+// Encloses the using declarations in a namespace so they do not leak into the outside scope
+
+namespace container_input_adapter_factory_impl
+{
+
+using std::begin;
+using std::end;
+
+template<typename ContainerType, typename Enable = void>
+struct container_input_adapter_factory {};
+
+template<typename ContainerType>
+struct container_input_adapter_factory< ContainerType,
+       void_t<decltype(begin(std::declval<ContainerType>()), end(std::declval<ContainerType>()))>>
+{
+    using adapter_type = decltype(input_adapter(begin(std::declval<ContainerType>()), end(std::declval<ContainerType>())));
+
+    static adapter_type create(const ContainerType& container)
+    {
+        return input_adapter(begin(container), end(container));
+    }
+};
+
+} // namespace container_input_adapter_factory_impl
+
+template<typename ContainerType>
+typename container_input_adapter_factory_impl::container_input_adapter_factory<ContainerType>::adapter_type input_adapter(const ContainerType& container)
+{
+    return container_input_adapter_factory_impl::container_input_adapter_factory<ContainerType>::create(container);
+}
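+
+// Example (illustrative): any container whose begin()/end() are found via ADL
+// can serve as parser input through this overload:
+//
+//   std::vector<char> buf = {'[', '1', ',', '2', ']'};
+//   auto ia = nlohmann::detail::input_adapter(buf);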
+
+#ifndef JSON_NO_IO
+// Special cases with fast paths
+inline file_input_adapter input_adapter(std::FILE* file)
+{
+    return file_input_adapter(file);
+}
+
+inline input_stream_adapter input_adapter(std::istream& stream)
+{
+    return input_stream_adapter(stream);
+}
+
+inline input_stream_adapter input_adapter(std::istream&& stream)
+{
+    return input_stream_adapter(stream);
+}
+#endif  // JSON_NO_IO
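+
+// Example (illustrative): the stream overloads above let callers parse
+// directly from standard I/O, e.g.
+//
+//   std::ifstream f("data.json");
+//   nlohmann::json j = nlohmann::json::parse(f);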
+
+using contiguous_bytes_input_adapter = decltype(input_adapter(std::declval<const char*>(), std::declval<const char*>()));
+
+// Null-terminated strings, and the like.
+template < typename CharT,
+           typename std::enable_if <
+               std::is_pointer<CharT>::value&&
+               !std::is_array<CharT>::value&&
+               std::is_integral<typename std::remove_pointer<CharT>::type>::value&&
+               sizeof(typename std::remove_pointer<CharT>::type) == 1,
+               int >::type = 0 >
+contiguous_bytes_input_adapter input_adapter(CharT b)
+{
+    auto length = std::strlen(reinterpret_cast<const char*>(b));
+    const auto* ptr = reinterpret_cast<const char*>(b);
+    return input_adapter(ptr, ptr + length);
+}
+
+template<typename T, std::size_t N>
+auto input_adapter(T (&array)[N]) -> decltype(input_adapter(array, array + N)) // NOLINT(cppcoreguidelines-avoid-c-arrays,hicpp-avoid-c-arrays,modernize-avoid-c-arrays)
+{
+    return input_adapter(array, array + N);
+}
+
+// This class only handles inputs of contiguous_bytes_input_adapter type.
+// It's required so that expressions like {ptr, len} can be implicitly converted
+// to the correct adapter.
+class span_input_adapter
+{
+  public:
+    template < typename CharT,
+               typename std::enable_if <
+                   std::is_pointer<CharT>::value&&
+                   std::is_integral<typename std::remove_pointer<CharT>::type>::value&&
+                   sizeof(typename std::remove_pointer<CharT>::type) == 1,
+                   int >::type = 0 >
+    span_input_adapter(CharT b, std::size_t l)
+        : ia(reinterpret_cast<const char*>(b), reinterpret_cast<const char*>(b) + l) {}
+
+    template<class IteratorType,
+             typename std::enable_if<
+                 std::is_same<typename iterator_traits<IteratorType>::iterator_category, std::random_access_iterator_tag>::value,
+                 int>::type = 0>
+    span_input_adapter(IteratorType first, IteratorType last)
+        : ia(input_adapter(first, last)) {}
+
+    contiguous_bytes_input_adapter&& get()
+    {
+        return std::move(ia); // NOLINT(hicpp-move-const-arg,performance-move-const-arg)
+    }
+
+  private:
+    contiguous_bytes_input_adapter ia;
+};
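+
+// Illustrative note (editorial): given a function f(span_input_adapter), a
+// call such as f({ptr, len}) with a byte pointer and a length implicitly
+// constructs the adapter via the (CharT, std::size_t) constructor; the
+// callee then obtains the underlying adapter with get().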
+}  // namespace detail
+}  // namespace nlohmann
+
+// #include <nlohmann/detail/input/json_sax.hpp>
+
+
+#include <cstddef>
+#include <string> // string
+#include <utility> // move
+#include <vector> // vector
+
+// #include <nlohmann/detail/exceptions.hpp>
+
+// #include <nlohmann/detail/macro_scope.hpp>
+
+
+namespace nlohmann
+{
+
+/*!
+@brief SAX interface
+
+This class describes the SAX interface used by @ref nlohmann::json::sax_parse.
+Each function is called in different situations while the input is parsed. The
+boolean return value informs the parser whether to continue processing the
+input.
+*/
+template<typename BasicJsonType>
+struct json_sax
+{
+    using number_integer_t = typename BasicJsonType::number_integer_t;
+    using number_unsigned_t = typename BasicJsonType::number_unsigned_t;
+    using number_float_t = typename BasicJsonType::number_float_t;
+    using string_t = typename BasicJsonType::string_t;
+    using binary_t = typename BasicJsonType::binary_t;
+
+    /*!
+    @brief a null value was read
+    @return whether parsing should proceed
+    */
+    virtual bool null() = 0;
+
+    /*!
+    @brief a boolean value was read
+    @param[in] val  boolean value
+    @return whether parsing should proceed
+    */
+    virtual bool boolean(bool val) = 0;
+
+    /*!
+    @brief an integer number was read
+    @param[in] val  integer value
+    @return whether parsing should proceed
+    */
+    virtual bool number_integer(number_integer_t val) = 0;
+
+    /*!
+    @brief an unsigned integer number was read
+    @param[in] val  unsigned integer value
+    @return whether parsing should proceed
+    */
+    virtual bool number_unsigned(number_unsigned_t val) = 0;
+
+    /*!
+    @brief a floating-point number was read
+    @param[in] val  floating-point value
+    @param[in] s    raw token value
+    @return whether parsing should proceed
+    */
+    virtual bool number_float(number_float_t val, const string_t& s) = 0;
+
+    /*!
+    @brief a string value was read
+    @param[in] val  string value
+    @return whether parsing should proceed
+    @note It is safe to move the passed string value.
+    */
+    virtual bool string(string_t& val) = 0;
+
+    /*!
+    @brief a binary value was read
+    @param[in] val  binary value
+    @return whether parsing should proceed
+    @note It is safe to move the passed binary value.
+    */
+    virtual bool binary(binary_t& val) = 0;
+
+    /*!
+    @brief the beginning of an object was read
+    @param[in] elements  number of object elements or -1 if unknown
+    @return whether parsing should proceed
+    @note binary formats may report the number of elements
+    */
+    virtual bool start_object(std::size_t elements) = 0;
+
+    /*!
+    @brief an object key was read
+    @param[in] val  object key
+    @return whether parsing should proceed
+    @note It is safe to move the passed string.
+    */
+    virtual bool key(string_t& val) = 0;
+
+    /*!
+    @brief the end of an object was read
+    @return whether parsing should proceed
+    */
+    virtual bool end_object() = 0;
+
+    /*!
+    @brief the beginning of an array was read
+    @param[in] elements  number of array elements or -1 if unknown
+    @return whether parsing should proceed
+    @note binary formats may report the number of elements
+    */
+    virtual bool start_array(std::size_t elements) = 0;
+
+    /*!
+    @brief the end of an array was read
+    @return whether parsing should proceed
+    */
+    virtual bool end_array() = 0;
+
+    /*!
+    @brief a parse error occurred
+    @param[in] position    the position in the input where the error occurs
+    @param[in] last_token  the last read token
+    @param[in] ex          an exception object describing the error
+    @return whether parsing should proceed (must return false)
+    */
+    virtual bool parse_error(std::size_t position,
+                             const std::string& last_token,
+                             const detail::exception& ex) = 0;
+
+    json_sax() = default;
+    json_sax(const json_sax&) = default;
+    json_sax(json_sax&&) noexcept = default;
+    json_sax& operator=(const json_sax&) = default;
+    json_sax& operator=(json_sax&&) noexcept = default;
+    virtual ~json_sax() = default;
+};
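+
+/*
+Illustrative sketch (editorial example, not part of the library): a handler
+implementing the interface above that accepts every event and merely counts
+object keys. Each callback returns true so that parsing proceeds.
+
+    template<typename BasicJsonType>
+    struct key_counting_sax : json_sax<BasicJsonType>
+    {
+        using sax = json_sax<BasicJsonType>;
+        std::size_t keys = 0;
+        bool null() override { return true; }
+        bool boolean(bool) override { return true; }
+        bool number_integer(typename sax::number_integer_t) override { return true; }
+        bool number_unsigned(typename sax::number_unsigned_t) override { return true; }
+        bool number_float(typename sax::number_float_t, const typename sax::string_t&) override { return true; }
+        bool string(typename sax::string_t&) override { return true; }
+        bool binary(typename sax::binary_t&) override { return true; }
+        bool start_object(std::size_t) override { return true; }
+        bool key(typename sax::string_t& /*val*/) override { ++keys; return true; }
+        bool end_object() override { return true; }
+        bool start_array(std::size_t) override { return true; }
+        bool end_array() override { return true; }
+        bool parse_error(std::size_t, const std::string&, const detail::exception&) override { return false; }
+    };
+*/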
+
+
+namespace detail
+{
+/*!
+@brief SAX implementation to create a JSON value from SAX events
+
+This class implements the @ref json_sax interface and processes the SAX events
+to create a JSON value which makes it basically a DOM parser. The structure or
+hierarchy of the JSON value is managed by the stack `ref_stack` which contains
+a pointer to the respective array or object for each recursion depth.
+
+After successful parsing, the value that is passed by reference to the
+constructor contains the parsed value.
+
+@tparam BasicJsonType  the JSON type
+*/
+template<typename BasicJsonType>
+class json_sax_dom_parser
+{
+  public:
+    using number_integer_t = typename BasicJsonType::number_integer_t;
+    using number_unsigned_t = typename BasicJsonType::number_unsigned_t;
+    using number_float_t = typename BasicJsonType::number_float_t;
+    using string_t = typename BasicJsonType::string_t;
+    using binary_t = typename BasicJsonType::binary_t;
+
+    /*!
+    @param[in,out] r  reference to a JSON value that is manipulated while
+                       parsing
+    @param[in] allow_exceptions_  whether parse errors yield exceptions
+    */
+    explicit json_sax_dom_parser(BasicJsonType& r, const bool allow_exceptions_ = true)
+        : root(r), allow_exceptions(allow_exceptions_)
+    {}
+
+    // make class move-only
+    json_sax_dom_parser(const json_sax_dom_parser&) = delete;
+    json_sax_dom_parser(json_sax_dom_parser&&) = default; // NOLINT(hicpp-noexcept-move,performance-noexcept-move-constructor)
+    json_sax_dom_parser& operator=(const json_sax_dom_parser&) = delete;
+    json_sax_dom_parser& operator=(json_sax_dom_parser&&) = default; // NOLINT(hicpp-noexcept-move,performance-noexcept-move-constructor)
+    ~json_sax_dom_parser() = default;
+
+    bool null()
+    {
+        handle_value(nullptr);
+        return true;
+    }
+
+    bool boolean(bool val)
+    {
+        handle_value(val);
+        return true;
+    }
+
+    bool number_integer(number_integer_t val)
+    {
+        handle_value(val);
+        return true;
+    }
+
+    bool number_unsigned(number_unsigned_t val)
+    {
+        handle_value(val);
+        return true;
+    }
+
+    bool number_float(number_float_t val, const string_t& /*unused*/)
+    {
+        handle_value(val);
+        return true;
+    }
+
+    bool string(string_t& val)
+    {
+        handle_value(val);
+        return true;
+    }
+
+    bool binary(binary_t& val)
+    {
+        handle_value(std::move(val));
+        return true;
+    }
+
+    bool start_object(std::size_t len)
+    {
+        ref_stack.push_back(handle_value(BasicJsonType::value_t::object));
+
+        if (JSON_HEDLEY_UNLIKELY(len != static_cast<std::size_t>(-1) && len > ref_stack.back()->max_size()))
+        {
+            JSON_THROW(out_of_range::create(408, "excessive object size: " + std::to_string(len), *ref_stack.back()));
+        }
+
+        return true;
+    }
+
+    bool key(string_t& val)
+    {
+        // add null at given key and store the reference for later
+        object_element = &(ref_stack.back()->m_value.object->operator[](val));
+        return true;
+    }
+
+    bool end_object()
+    {
+        ref_stack.back()->set_parents();
+        ref_stack.pop_back();
+        return true;
+    }
+
+    bool start_array(std::size_t len)
+    {
+        ref_stack.push_back(handle_value(BasicJsonType::value_t::array));
+
+        if (JSON_HEDLEY_UNLIKELY(len != static_cast<std::size_t>(-1) && len > ref_stack.back()->max_size()))
+        {
+            JSON_THROW(out_of_range::create(408, "excessive array size: " + std::to_string(len), *ref_stack.back()));
+        }
+
+        return true;
+    }
+
+    bool end_array()
+    {
+        ref_stack.back()->set_parents();
+        ref_stack.pop_back();
+        return true;
+    }
+
+    template<class Exception>
+    bool parse_error(std::size_t /*unused*/, const std::string& /*unused*/,
+                     const Exception& ex)
+    {
+        errored = true;
+        static_cast<void>(ex);
+        if (allow_exceptions)
+        {
+            JSON_THROW(ex);
+        }
+        return false;
+    }
+
+    constexpr bool is_errored() const
+    {
+        return errored;
+    }
+
+  private:
+    /*!
+    @invariant If the ref stack is empty, then the passed value will be the new
+               root.
+    @invariant If the ref stack contains a value, then it is an array or an
+               object to which we can add elements
+    */
+    template<typename Value>
+    JSON_HEDLEY_RETURNS_NON_NULL
+    BasicJsonType* handle_value(Value&& v)
+    {
+        if (ref_stack.empty())
+        {
+            root = BasicJsonType(std::forward<Value>(v));
+            return &root;
+        }
+
+        JSON_ASSERT(ref_stack.back()->is_array() || ref_stack.back()->is_object());
+
+        if (ref_stack.back()->is_array())
+        {
+            ref_stack.back()->m_value.array->emplace_back(std::forward<Value>(v));
+            return &(ref_stack.back()->m_value.array->back());
+        }
+
+        JSON_ASSERT(ref_stack.back()->is_object());
+        JSON_ASSERT(object_element);
+        *object_element = BasicJsonType(std::forward<Value>(v));
+        return object_element;
+    }
+
+    /// the parsed JSON value
+    BasicJsonType& root;
+    /// stack to model hierarchy of values
+    std::vector<BasicJsonType*> ref_stack {};
+    /// helper to hold the reference for the next object element
+    BasicJsonType* object_element = nullptr;
+    /// whether a syntax error occurred
+    bool errored = false;
+    /// whether to throw exceptions in case of errors
+    const bool allow_exceptions = true;
+};
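+
+// Illustrative trace (editorial note): feeding `[1, {"a": 2}]` through this
+// handler produces the event sequence
+//   start_array          -> ref_stack: [array]
+//   number_unsigned(1)   -> handle_value appends to the open array
+//   start_object         -> ref_stack: [array, object]
+//   key("a"), number_unsigned(2) -> value written through object_element
+//   end_object, end_array -> set_parents() and pop the stack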
+
+template<typename BasicJsonType>
+class json_sax_dom_callback_parser
+{
+  public:
+    using number_integer_t = typename BasicJsonType::number_integer_t;
+    using number_unsigned_t = typename BasicJsonType::number_unsigned_t;
+    using number_float_t = typename BasicJsonType::number_float_t;
+    using string_t = typename BasicJsonType::string_t;
+    using binary_t = typename BasicJsonType::binary_t;
+    using parser_callback_t = typename BasicJsonType::parser_callback_t;
+    using parse_event_t = typename BasicJsonType::parse_event_t;
+
+    json_sax_dom_callback_parser(BasicJsonType& r,
+                                 const parser_callback_t cb,
+                                 const bool allow_exceptions_ = true)
+        : root(r), callback(cb), allow_exceptions(allow_exceptions_)
+    {
+        keep_stack.push_back(true);
+    }
+
+    // make class move-only
+    json_sax_dom_callback_parser(const json_sax_dom_callback_parser&) = delete;
+    json_sax_dom_callback_parser(json_sax_dom_callback_parser&&) = default; // NOLINT(hicpp-noexcept-move,performance-noexcept-move-constructor)
+    json_sax_dom_callback_parser& operator=(const json_sax_dom_callback_parser&) = delete;
+    json_sax_dom_callback_parser& operator=(json_sax_dom_callback_parser&&) = default; // NOLINT(hicpp-noexcept-move,performance-noexcept-move-constructor)
+    ~json_sax_dom_callback_parser() = default;
+
+    bool null()
+    {
+        handle_value(nullptr);
+        return true;
+    }
+
+    bool boolean(bool val)
+    {
+        handle_value(val);
+        return true;
+    }
+
+    bool number_integer(number_integer_t val)
+    {
+        handle_value(val);
+        return true;
+    }
+
+    bool number_unsigned(number_unsigned_t val)
+    {
+        handle_value(val);
+        return true;
+    }
+
+    bool number_float(number_float_t val, const string_t& /*unused*/)
+    {
+        handle_value(val);
+        return true;
+    }
+
+    bool string(string_t& val)
+    {
+        handle_value(val);
+        return true;
+    }
+
+    bool binary(binary_t& val)
+    {
+        handle_value(std::move(val));
+        return true;
+    }
+
+    bool start_object(std::size_t len)
+    {
+        // check callback for object start
+        const bool keep = callback(static_cast<int>(ref_stack.size()), parse_event_t::object_start, discarded);
+        keep_stack.push_back(keep);
+
+        auto val = handle_value(BasicJsonType::value_t::object, true);
+        ref_stack.push_back(val.second);
+
+        // check object limit
+        if (ref_stack.back() && JSON_HEDLEY_UNLIKELY(len != static_cast<std::size_t>(-1) && len > ref_stack.back()->max_size()))
+        {
+            JSON_THROW(out_of_range::create(408, "excessive object size: " + std::to_string(len), *ref_stack.back()));
+        }
+
+        return true;
+    }
+
+    bool key(string_t& val)
+    {
+        BasicJsonType k = BasicJsonType(val);
+
+        // check callback for key
+        const bool keep = callback(static_cast<int>(ref_stack.size()), parse_event_t::key, k);
+        key_keep_stack.push_back(keep);
+
+        // add discarded value at given key and store the reference for later
+        if (keep && ref_stack.back())
+        {
+            object_element = &(ref_stack.back()->m_value.object->operator[](val) = discarded);
+        }
+
+        return true;
+    }
+
+    bool end_object()
+    {
+        if (ref_stack.back())
+        {
+            if (!callback(static_cast<int>(ref_stack.size()) - 1, parse_event_t::object_end, *ref_stack.back()))
+            {
+                // discard object
+                *ref_stack.back() = discarded;
+            }
+            else
+            {
+                ref_stack.back()->set_parents();
+            }
+        }
+
+        JSON_ASSERT(!ref_stack.empty());
+        JSON_ASSERT(!keep_stack.empty());
+        ref_stack.pop_back();
+        keep_stack.pop_back();
+
+        if (!ref_stack.empty() && ref_stack.back() && ref_stack.back()->is_structured())
+        {
+            // remove discarded value
+            for (auto it = ref_stack.back()->begin(); it != ref_stack.back()->end(); ++it)
+            {
+                if (it->is_discarded())
+                {
+                    ref_stack.back()->erase(it);
+                    break;
+                }
+            }
+        }
+
+        return true;
+    }
+
+    bool start_array(std::size_t len)
+    {
+        const bool keep = callback(static_cast<int>(ref_stack.size()), parse_event_t::array_start, discarded);
+        keep_stack.push_back(keep);
+
+        auto val = handle_value(BasicJsonType::value_t::array, true);
+        ref_stack.push_back(val.second);
+
+        // check array limit
+        if (ref_stack.back() && JSON_HEDLEY_UNLIKELY(len != static_cast<std::size_t>(-1) && len > ref_stack.back()->max_size()))
+        {
+            JSON_THROW(out_of_range::create(408, "excessive array size: " + std::to_string(len), *ref_stack.back()));
+        }
+
+        return true;
+    }
+
+    bool end_array()
+    {
+        bool keep = true;
+
+        if (ref_stack.back())
+        {
+            keep = callback(static_cast<int>(ref_stack.size()) - 1, parse_event_t::array_end, *ref_stack.back());
+            if (keep)
+            {
+                ref_stack.back()->set_parents();
+            }
+            else
+            {
+                // discard array
+                *ref_stack.back() = discarded;
+            }
+        }
+
+        JSON_ASSERT(!ref_stack.empty());
+        JSON_ASSERT(!keep_stack.empty());
+        ref_stack.pop_back();
+        keep_stack.pop_back();
+
+        // remove discarded value
+        if (!keep && !ref_stack.empty() && ref_stack.back()->is_array())
+        {
+            ref_stack.back()->m_value.array->pop_back();
+        }
+
+        return true;
+    }
+
+    template<class Exception>
+    bool parse_error(std::size_t /*unused*/, const std::string& /*unused*/,
+                     const Exception& ex)
+    {
+        errored = true;
+        static_cast<void>(ex);
+        if (allow_exceptions)
+        {
+            JSON_THROW(ex);
+        }
+        return false;
+    }
+
+    constexpr bool is_errored() const
+    {
+        return errored;
+    }
+
+  private:
+    /*!
+    @param[in] v  value to add to the JSON value we build during parsing
+    @param[in] skip_callback  whether we should skip calling the callback
+               function; this is required after start_array() and
+               start_object() SAX events, because otherwise we would call the
+               callback function with an empty array or object, respectively.
+
+    @invariant If the ref stack is empty, then the passed value will be the new
+               root.
+    @invariant If the ref stack contains a value, then it is an array or an
+               object to which we can add elements
+
+    @return pair of boolean (whether value should be kept) and pointer (to the
+            passed value in the ref_stack hierarchy; nullptr if not kept)
+    */
+    template<typename Value>
+    std::pair<bool, BasicJsonType*> handle_value(Value&& v, const bool skip_callback = false)
+    {
+        JSON_ASSERT(!keep_stack.empty());
+
+        // do not handle this value if we know it would be added to a discarded
+        // container
+        if (!keep_stack.back())
+        {
+            return {false, nullptr};
+        }
+
+        // create value
+        auto value = BasicJsonType(std::forward<Value>(v));
+
+        // check callback
+        const bool keep = skip_callback || callback(static_cast<int>(ref_stack.size()), parse_event_t::value, value);
+
+        // do not handle this value if we just learnt it shall be discarded
+        if (!keep)
+        {
+            return {false, nullptr};
+        }
+
+        if (ref_stack.empty())
+        {
+            root = std::move(value);
+            return {true, &root};
+        }
+
+        // skip this value if we already decided to skip the parent
+        // (https://github.com/nlohmann/json/issues/971#issuecomment-413678360)
+        if (!ref_stack.back())
+        {
+            return {false, nullptr};
+        }
+
+        // we now only expect arrays and objects
+        JSON_ASSERT(ref_stack.back()->is_array() || ref_stack.back()->is_object());
+
+        // array
+        if (ref_stack.back()->is_array())
+        {
+            ref_stack.back()->m_value.array->emplace_back(std::move(value));
+            return {true, &(ref_stack.back()->m_value.array->back())};
+        }
+
+        // object
+        JSON_ASSERT(ref_stack.back()->is_object());
+        // check if we should store an element for the current key
+        JSON_ASSERT(!key_keep_stack.empty());
+        const bool store_element = key_keep_stack.back();
+        key_keep_stack.pop_back();
+
+        if (!store_element)
+        {
+            return {false, nullptr};
+        }
+
+        JSON_ASSERT(object_element);
+        *object_element = std::move(value);
+        return {true, object_element};
+    }
+
+    /// the parsed JSON value
+    BasicJsonType& root;
+    /// stack to model hierarchy of values
+    std::vector<BasicJsonType*> ref_stack {};
+    /// stack to manage which values to keep
+    std::vector<bool> keep_stack {};
+    /// stack to manage which object keys to keep
+    std::vector<bool> key_keep_stack {};
+    /// helper to hold the reference for the next object element
+    BasicJsonType* object_element = nullptr;
+    /// whether a syntax error occurred
+    bool errored = false;
+    /// callback function
+    const parser_callback_t callback = nullptr;
+    /// whether to throw exceptions in case of errors
+    const bool allow_exceptions = true;
+    /// a discarded value for the callback
+    BasicJsonType discarded = BasicJsonType::value_t::discarded;
+};
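+
+// Illustrative sketch (editorial example, assuming a basic_json
+// specialization named json): a callback that discards every object member
+// whose key is "debug":
+//
+//   json::parser_callback_t cb = [](int /*depth*/, json::parse_event_t event,
+//                                   json& parsed)
+//   {
+//       return !(event == json::parse_event_t::key && parsed == "debug");
+//   };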
+
+template<typename BasicJsonType>
+class json_sax_acceptor
+{
+  public:
+    using number_integer_t = typename BasicJsonType::number_integer_t;
+    using number_unsigned_t = typename BasicJsonType::number_unsigned_t;
+    using number_float_t = typename BasicJsonType::number_float_t;
+    using string_t = typename BasicJsonType::string_t;
+    using binary_t = typename BasicJsonType::binary_t;
+
+    bool null()
+    {
+        return true;
+    }
+
+    bool boolean(bool /*unused*/)
+    {
+        return true;
+    }
+
+    bool number_integer(number_integer_t /*unused*/)
+    {
+        return true;
+    }
+
+    bool number_unsigned(number_unsigned_t /*unused*/)
+    {
+        return true;
+    }
+
+    bool number_float(number_float_t /*unused*/, const string_t& /*unused*/)
+    {
+        return true;
+    }
+
+    bool string(string_t& /*unused*/)
+    {
+        return true;
+    }
+
+    bool binary(binary_t& /*unused*/)
+    {
+        return true;
+    }
+
+    bool start_object(std::size_t /*unused*/ = static_cast<std::size_t>(-1))
+    {
+        return true;
+    }
+
+    bool key(string_t& /*unused*/)
+    {
+        return true;
+    }
+
+    bool end_object()
+    {
+        return true;
+    }
+
+    bool start_array(std::size_t /*unused*/ = static_cast<std::size_t>(-1))
+    {
+        return true;
+    }
+
+    bool end_array()
+    {
+        return true;
+    }
+
+    bool parse_error(std::size_t /*unused*/, const std::string& /*unused*/, const detail::exception& /*unused*/)
+    {
+        return false;
+    }
+};
+}  // namespace detail
+
+}  // namespace nlohmann
+
+// #include <nlohmann/detail/input/lexer.hpp>
+
+
+#include <array> // array
+#include <clocale> // localeconv
+#include <cstddef> // size_t
+#include <cstdio> // snprintf
+#include <cstdlib> // strtof, strtod, strtold, strtoll, strtoull
+#include <initializer_list> // initializer_list
+#include <string> // char_traits, string
+#include <utility> // move
+#include <vector> // vector
+
+// #include <nlohmann/detail/input/input_adapters.hpp>
+
+// #include <nlohmann/detail/input/position_t.hpp>
+
+// #include <nlohmann/detail/macro_scope.hpp>
+
+
+namespace nlohmann
+{
+namespace detail
+{
+///////////
+// lexer //
+///////////
+
+template<typename BasicJsonType>
+class lexer_base
+{
+  public:
+    /// token types for the parser
+    enum class token_type
+    {
+        uninitialized,    ///< indicating the scanner is uninitialized
+        literal_true,     ///< the `true` literal
+        literal_false,    ///< the `false` literal
+        literal_null,     ///< the `null` literal
+        value_string,     ///< a string -- use get_string() for actual value
+        value_unsigned,   ///< an unsigned integer -- use get_number_unsigned() for actual value
+        value_integer,    ///< a signed integer -- use get_number_integer() for actual value
+        value_float,      ///< a floating-point number -- use get_number_float() for actual value
+        begin_array,      ///< the character for array begin `[`
+        begin_object,     ///< the character for object begin `{`
+        end_array,        ///< the character for array end `]`
+        end_object,       ///< the character for object end `}`
+        name_separator,   ///< the name separator `:`
+        value_separator,  ///< the value separator `,`
+        parse_error,      ///< indicating a parse error
+        end_of_input,     ///< indicating the end of the input buffer
+        literal_or_value  ///< a literal or the beginning of a value (only for diagnostics)
+    };
+
+    /// return name of values of type token_type (only used for errors)
+    JSON_HEDLEY_RETURNS_NON_NULL
+    JSON_HEDLEY_CONST
+    static const char* token_type_name(const token_type t) noexcept
+    {
+        switch (t)
+        {
+            case token_type::uninitialized:
+                return "<uninitialized>";
+            case token_type::literal_true:
+                return "true literal";
+            case token_type::literal_false:
+                return "false literal";
+            case token_type::literal_null:
+                return "null literal";
+            case token_type::value_string:
+                return "string literal";
+            case token_type::value_unsigned:
+            case token_type::value_integer:
+            case token_type::value_float:
+                return "number literal";
+            case token_type::begin_array:
+                return "'['";
+            case token_type::begin_object:
+                return "'{'";
+            case token_type::end_array:
+                return "']'";
+            case token_type::end_object:
+                return "'}'";
+            case token_type::name_separator:
+                return "':'";
+            case token_type::value_separator:
+                return "','";
+            case token_type::parse_error:
+                return "<parse error>";
+            case token_type::end_of_input:
+                return "end of input";
+            case token_type::literal_or_value:
+                return "'[', '{', or a literal";
+            // LCOV_EXCL_START
+            default: // catch non-enum values
+                return "unknown token";
+                // LCOV_EXCL_STOP
+        }
+    }
+};
+/*!
+@brief lexical analysis
+
+This class organizes the lexical analysis during JSON deserialization.
+*/
+template<typename BasicJsonType, typename InputAdapterType>
+class lexer : public lexer_base<BasicJsonType>
+{
+    using number_integer_t = typename BasicJsonType::number_integer_t;
+    using number_unsigned_t = typename BasicJsonType::number_unsigned_t;
+    using number_float_t = typename BasicJsonType::number_float_t;
+    using string_t = typename BasicJsonType::string_t;
+    using char_type = typename InputAdapterType::char_type;
+    using char_int_type = typename std::char_traits<char_type>::int_type;
+
+  public:
+    using token_type = typename lexer_base<BasicJsonType>::token_type;
+
+    explicit lexer(InputAdapterType&& adapter, bool ignore_comments_ = false) noexcept
+        : ia(std::move(adapter))
+        , ignore_comments(ignore_comments_)
+        , decimal_point_char(static_cast<char_int_type>(get_decimal_point()))
+    {}
+
+    // delete because of pointer members
+    lexer(const lexer&) = delete;
+    lexer(lexer&&) = default; // NOLINT(hicpp-noexcept-move,performance-noexcept-move-constructor)
+    lexer& operator=(lexer&) = delete;
+    lexer& operator=(lexer&&) = default; // NOLINT(hicpp-noexcept-move,performance-noexcept-move-constructor)
+    ~lexer() = default;
+
+  private:
+    /////////////////////
+    // locales
+    /////////////////////
+
+    /// return the locale-dependent decimal point
+    JSON_HEDLEY_PURE
+    static char get_decimal_point() noexcept
+    {
+        const auto* loc = localeconv();
+        JSON_ASSERT(loc != nullptr);
+        return (loc->decimal_point == nullptr) ? '.' : *(loc->decimal_point);
+    }
+
+    /////////////////////
+    // scan functions
+    /////////////////////
+
+    /*!
+    @brief get codepoint from 4 hex characters following `\u`
+
+    For input "\u c1 c2 c3 c4" the codepoint is:
+      (c1 * 0x1000) + (c2 * 0x0100) + (c3 * 0x0010) + c4
+    = (c1 << 12) + (c2 << 8) + (c3 << 4) + (c4 << 0)
+
+    Furthermore, the possible characters '0'..'9', 'A'..'F', and 'a'..'f'
+    must be converted to the integers 0x0..0x9, 0xA..0xF, 0xA..0xF, resp. The
+    conversion is done by subtracting the offset (0x30, 0x37, and 0x57)
+    between the ASCII value of the character and the desired integer value.
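+
+    For example, for the escape `\u00E9` the digits '0', '0', 'E', '9' map to
+    0x0, 0x0, 0xE, and 0x9, so the codepoint is (0xE << 4) + 0x9 = 0x00E9.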
+
+    @return codepoint (0x0000..0xFFFF) or -1 in case of an error (e.g. EOF or
+            non-hex character)
+    */
+    int get_codepoint()
+    {
+        // this function only makes sense after reading `\u`
+        JSON_ASSERT(current == 'u');
+        int codepoint = 0;
+
+        const auto factors = { 12u, 8u, 4u, 0u };
+        for (const auto factor : factors)
+        {
+            get();
+
+            if (current >= '0' && current <= '9')
+            {
+                codepoint += static_cast<int>((static_cast<unsigned int>(current) - 0x30u) << factor);
+            }
+            else if (current >= 'A' && current <= 'F')
+            {
+                codepoint += static_cast<int>((static_cast<unsigned int>(current) - 0x37u) << factor);
+            }
+            else if (current >= 'a' && current <= 'f')
+            {
+                codepoint += static_cast<int>((static_cast<unsigned int>(current) - 0x57u) << factor);
+            }
+            else
+            {
+                return -1;
+            }
+        }
+
+        JSON_ASSERT(0x0000 <= codepoint && codepoint <= 0xFFFF);
+        return codepoint;
+    }
+
+    /*!
+    @brief check if the next byte(s) are inside a given range
+
+    Adds the current byte and, for each passed range, reads a new byte and
+    checks if it is inside the range. If a violation is detected, an error
+    message is set up and false is returned. Otherwise, true is returned.
+
+    @param[in] ranges  list of integers; interpreted as list of pairs of
+                       inclusive lower and upper bound, respectively
+
+    @pre The passed list @a ranges must have 2, 4, or 6 elements; that is,
+         1, 2, or 3 pairs. This precondition is enforced by an assertion.
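+
+    For example, for the UTF-8 lead byte 0xE0 the caller passes the ranges
+    {0xA0, 0xBF, 0x80, 0xBF}: the second byte must lie in A0..BF and the
+    third byte in 80..BF.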
+
+    @return true if and only if no range violation was detected
+    */
+    bool next_byte_in_range(std::initializer_list<char_int_type> ranges)
+    {
+        JSON_ASSERT(ranges.size() == 2 || ranges.size() == 4 || ranges.size() == 6);
+        add(current);
+
+        for (auto range = ranges.begin(); range != ranges.end(); ++range)
+        {
+            get();
+            if (JSON_HEDLEY_LIKELY(*range <= current && current <= *(++range)))
+            {
+                add(current);
+            }
+            else
+            {
+                error_message = "invalid string: ill-formed UTF-8 byte";
+                return false;
+            }
+        }
+
+        return true;
+    }
+
+    /*!
+    @brief scan a string literal
+
+    This function scans a string according to Sect. 7 of RFC 8259. While
+    scanning, bytes are escaped and copied into buffer token_buffer. When the
+    function returns successfully, token_buffer is *not* null-terminated (as it
+    may contain \0 bytes), and token_buffer.size() is the number of bytes in the
+    string.
+
+    @return token_type::value_string if string could be successfully scanned,
+            token_type::parse_error otherwise
+
+    @note In case of errors, variable error_message contains a textual
+          description.
+    */
+    token_type scan_string()
+    {
+        // reset token_buffer (ignore opening quote)
+        reset();
+
+        // we entered the function by reading an open quote
+        JSON_ASSERT(current == '\"');
+
+        while (true)
+        {
+            // get next character
+            switch (get())
+            {
+                // end of file while parsing string
+                case std::char_traits<char_type>::eof():
+                {
+                    error_message = "invalid string: missing closing quote";
+                    return token_type::parse_error;
+                }
+
+                // closing quote
+                case '\"':
+                {
+                    return token_type::value_string;
+                }
+
+                // escapes
+                case '\\':
+                {
+                    switch (get())
+                    {
+                        // quotation mark
+                        case '\"':
+                            add('\"');
+                            break;
+                        // reverse solidus
+                        case '\\':
+                            add('\\');
+                            break;
+                        // solidus
+                        case '/':
+                            add('/');
+                            break;
+                        // backspace
+                        case 'b':
+                            add('\b');
+                            break;
+                        // form feed
+                        case 'f':
+                            add('\f');
+                            break;
+                        // line feed
+                        case 'n':
+                            add('\n');
+                            break;
+                        // carriage return
+                        case 'r':
+                            add('\r');
+                            break;
+                        // tab
+                        case 't':
+                            add('\t');
+                            break;
+
+                        // unicode escapes
+                        case 'u':
+                        {
+                            const int codepoint1 = get_codepoint();
+                            int codepoint = codepoint1; // start with codepoint1
+
+                            if (JSON_HEDLEY_UNLIKELY(codepoint1 == -1))
+                            {
+                                error_message = "invalid string: '\\u' must be followed by 4 hex digits";
+                                return token_type::parse_error;
+                            }
+
+                            // check if code point is a high surrogate
+                            if (0xD800 <= codepoint1 && codepoint1 <= 0xDBFF)
+                            {
+                                // expect next \uxxxx entry
+                                if (JSON_HEDLEY_LIKELY(get() == '\\' && get() == 'u'))
+                                {
+                                    const int codepoint2 = get_codepoint();
+
+                                    if (JSON_HEDLEY_UNLIKELY(codepoint2 == -1))
+                                    {
+                                        error_message = "invalid string: '\\u' must be followed by 4 hex digits";
+                                        return token_type::parse_error;
+                                    }
+
+                                    // check if codepoint2 is a low surrogate
+                                    if (JSON_HEDLEY_LIKELY(0xDC00 <= codepoint2 && codepoint2 <= 0xDFFF))
+                                    {
+                                        // overwrite codepoint
+                                        codepoint = static_cast<int>(
+                                                        // the high surrogate contributes the upper 10 payload bits
+                                                        (static_cast<unsigned int>(codepoint1) << 10u)
+                                                        // the low surrogate contributes the lower 10 payload bits
+                                                        + static_cast<unsigned int>(codepoint2)
+                                                        // the result still carries the 0xD800, 0xDC00 and
+                                                        // 0x10000 offsets, so we subtract the constant
+                                                        // (0xD800 << 10) + 0xDC00 - 0x10000 = 0x35FDC00
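+                                                        // e.g. for "\uD83D\uDE00" (U+1F600): (0xD83D << 10) + 0xDE00 - 0x35FDC00 = 0x1F600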
+                                                        - 0x35FDC00u);
+                                    }
+                                    else
+                                    {
+                                        error_message = "invalid string: surrogate U+D800..U+DBFF must be followed by U+DC00..U+DFFF";
+                                        return token_type::parse_error;
+                                    }
+                                }
+                                else
+                                {
+                                    error_message = "invalid string: surrogate U+D800..U+DBFF must be followed by U+DC00..U+DFFF";
+                                    return token_type::parse_error;
+                                }
+                            }
+                            else
+                            {
+                                if (JSON_HEDLEY_UNLIKELY(0xDC00 <= codepoint1 && codepoint1 <= 0xDFFF))
+                                {
+                                    error_message = "invalid string: surrogate U+DC00..U+DFFF must follow U+D800..U+DBFF";
+                                    return token_type::parse_error;
+                                }
+                            }
+
+                            // result of the above calculation yields a proper codepoint
+                            JSON_ASSERT(0x00 <= codepoint && codepoint <= 0x10FFFF);
+
+                            // translate codepoint into bytes
+                            if (codepoint < 0x80)
+                            {
+                                // 1-byte characters: 0xxxxxxx (ASCII)
+                                add(static_cast<char_int_type>(codepoint));
+                            }
+                            else if (codepoint <= 0x7FF)
+                            {
+                                // 2-byte characters: 110xxxxx 10xxxxxx
+                                add(static_cast<char_int_type>(0xC0u | (static_cast<unsigned int>(codepoint) >> 6u)));
+                                add(static_cast<char_int_type>(0x80u | (static_cast<unsigned int>(codepoint) & 0x3Fu)));
+                            }
+                            else if (codepoint <= 0xFFFF)
+                            {
+                                // 3-byte characters: 1110xxxx 10xxxxxx 10xxxxxx
+                                add(static_cast<char_int_type>(0xE0u | (static_cast<unsigned int>(codepoint) >> 12u)));
+                                add(static_cast<char_int_type>(0x80u | ((static_cast<unsigned int>(codepoint) >> 6u) & 0x3Fu)));
+                                add(static_cast<char_int_type>(0x80u | (static_cast<unsigned int>(codepoint) & 0x3Fu)));
+                            }
+                            else
+                            {
+                                // 4-byte characters: 11110xxx 10xxxxxx 10xxxxxx 10xxxxxx
+                                add(static_cast<char_int_type>(0xF0u | (static_cast<unsigned int>(codepoint) >> 18u)));
+                                add(static_cast<char_int_type>(0x80u | ((static_cast<unsigned int>(codepoint) >> 12u) & 0x3Fu)));
+                                add(static_cast<char_int_type>(0x80u | ((static_cast<unsigned int>(codepoint) >> 6u) & 0x3Fu)));
+                                add(static_cast<char_int_type>(0x80u | (static_cast<unsigned int>(codepoint) & 0x3Fu)));
+                            }
+
+                            break;
+                        }
+
+                        // other characters after escape
+                        default:
+                            error_message = "invalid string: forbidden character after backslash";
+                            return token_type::parse_error;
+                    }
+
+                    break;
+                }
+
+                // invalid control characters
+                case 0x00:
+                {
+                    error_message = "invalid string: control character U+0000 (NUL) must be escaped to \\u0000";
+                    return token_type::parse_error;
+                }
+
+                case 0x01:
+                {
+                    error_message = "invalid string: control character U+0001 (SOH) must be escaped to \\u0001";
+                    return token_type::parse_error;
+                }
+
+                case 0x02:
+                {
+                    error_message = "invalid string: control character U+0002 (STX) must be escaped to \\u0002";
+                    return token_type::parse_error;
+                }
+
+                case 0x03:
+                {
+                    error_message = "invalid string: control character U+0003 (ETX) must be escaped to \\u0003";
+                    return token_type::parse_error;
+                }
+
+                case 0x04:
+                {
+                    error_message = "invalid string: control character U+0004 (EOT) must be escaped to \\u0004";
+                    return token_type::parse_error;
+                }
+
+                case 0x05:
+                {
+                    error_message = "invalid string: control character U+0005 (ENQ) must be escaped to \\u0005";
+                    return token_type::parse_error;
+                }
+
+                case 0x06:
+                {
+                    error_message = "invalid string: control character U+0006 (ACK) must be escaped to \\u0006";
+                    return token_type::parse_error;
+                }
+
+                case 0x07:
+                {
+                    error_message = "invalid string: control character U+0007 (BEL) must be escaped to \\u0007";
+                    return token_type::parse_error;
+                }
+
+                case 0x08:
+                {
+                    error_message = "invalid string: control character U+0008 (BS) must be escaped to \\u0008 or \\b";
+                    return token_type::parse_error;
+                }
+
+                case 0x09:
+                {
+                    error_message = "invalid string: control character U+0009 (HT) must be escaped to \\u0009 or \\t";
+                    return token_type::parse_error;
+                }
+
+                case 0x0A:
+                {
+                    error_message = "invalid string: control character U+000A (LF) must be escaped to \\u000A or \\n";
+                    return token_type::parse_error;
+                }
+
+                case 0x0B:
+                {
+                    error_message = "invalid string: control character U+000B (VT) must be escaped to \\u000B";
+                    return token_type::parse_error;
+                }
+
+                case 0x0C:
+                {
+                    error_message = "invalid string: control character U+000C (FF) must be escaped to \\u000C or \\f";
+                    return token_type::parse_error;
+                }
+
+                case 0x0D:
+                {
+                    error_message = "invalid string: control character U+000D (CR) must be escaped to \\u000D or \\r";
+                    return token_type::parse_error;
+                }
+
+                case 0x0E:
+                {
+                    error_message = "invalid string: control character U+000E (SO) must be escaped to \\u000E";
+                    return token_type::parse_error;
+                }
+
+                case 0x0F:
+                {
+                    error_message = "invalid string: control character U+000F (SI) must be escaped to \\u000F";
+                    return token_type::parse_error;
+                }
+
+                case 0x10:
+                {
+                    error_message = "invalid string: control character U+0010 (DLE) must be escaped to \\u0010";
+                    return token_type::parse_error;
+                }
+
+                case 0x11:
+                {
+                    error_message = "invalid string: control character U+0011 (DC1) must be escaped to \\u0011";
+                    return token_type::parse_error;
+                }
+
+                case 0x12:
+                {
+                    error_message = "invalid string: control character U+0012 (DC2) must be escaped to \\u0012";
+                    return token_type::parse_error;
+                }
+
+                case 0x13:
+                {
+                    error_message = "invalid string: control character U+0013 (DC3) must be escaped to \\u0013";
+                    return token_type::parse_error;
+                }
+
+                case 0x14:
+                {
+                    error_message = "invalid string: control character U+0014 (DC4) must be escaped to \\u0014";
+                    return token_type::parse_error;
+                }
+
+                case 0x15:
+                {
+                    error_message = "invalid string: control character U+0015 (NAK) must be escaped to \\u0015";
+                    return token_type::parse_error;
+                }
+
+                case 0x16:
+                {
+                    error_message = "invalid string: control character U+0016 (SYN) must be escaped to \\u0016";
+                    return token_type::parse_error;
+                }
+
+                case 0x17:
+                {
+                    error_message = "invalid string: control character U+0017 (ETB) must be escaped to \\u0017";
+                    return token_type::parse_error;
+                }
+
+                case 0x18:
+                {
+                    error_message = "invalid string: control character U+0018 (CAN) must be escaped to \\u0018";
+                    return token_type::parse_error;
+                }
+
+                case 0x19:
+                {
+                    error_message = "invalid string: control character U+0019 (EM) must be escaped to \\u0019";
+                    return token_type::parse_error;
+                }
+
+                case 0x1A:
+                {
+                    error_message = "invalid string: control character U+001A (SUB) must be escaped to \\u001A";
+                    return token_type::parse_error;
+                }
+
+                case 0x1B:
+                {
+                    error_message = "invalid string: control character U+001B (ESC) must be escaped to \\u001B";
+                    return token_type::parse_error;
+                }
+
+                case 0x1C:
+                {
+                    error_message = "invalid string: control character U+001C (FS) must be escaped to \\u001C";
+                    return token_type::parse_error;
+                }
+
+                case 0x1D:
+                {
+                    error_message = "invalid string: control character U+001D (GS) must be escaped to \\u001D";
+                    return token_type::parse_error;
+                }
+
+                case 0x1E:
+                {
+                    error_message = "invalid string: control character U+001E (RS) must be escaped to \\u001E";
+                    return token_type::parse_error;
+                }
+
+                case 0x1F:
+                {
+                    error_message = "invalid string: control character U+001F (US) must be escaped to \\u001F";
+                    return token_type::parse_error;
+                }
+
+                // U+0020..U+007F (except U+0022 (quote) and U+005C (backslash))
+                case 0x20:
+                case 0x21:
+                case 0x23:
+                case 0x24:
+                case 0x25:
+                case 0x26:
+                case 0x27:
+                case 0x28:
+                case 0x29:
+                case 0x2A:
+                case 0x2B:
+                case 0x2C:
+                case 0x2D:
+                case 0x2E:
+                case 0x2F:
+                case 0x30:
+                case 0x31:
+                case 0x32:
+                case 0x33:
+                case 0x34:
+                case 0x35:
+                case 0x36:
+                case 0x37:
+                case 0x38:
+                case 0x39:
+                case 0x3A:
+                case 0x3B:
+                case 0x3C:
+                case 0x3D:
+                case 0x3E:
+                case 0x3F:
+                case 0x40:
+                case 0x41:
+                case 0x42:
+                case 0x43:
+                case 0x44:
+                case 0x45:
+                case 0x46:
+                case 0x47:
+                case 0x48:
+                case 0x49:
+                case 0x4A:
+                case 0x4B:
+                case 0x4C:
+                case 0x4D:
+                case 0x4E:
+                case 0x4F:
+                case 0x50:
+                case 0x51:
+                case 0x52:
+                case 0x53:
+                case 0x54:
+                case 0x55:
+                case 0x56:
+                case 0x57:
+                case 0x58:
+                case 0x59:
+                case 0x5A:
+                case 0x5B:
+                case 0x5D:
+                case 0x5E:
+                case 0x5F:
+                case 0x60:
+                case 0x61:
+                case 0x62:
+                case 0x63:
+                case 0x64:
+                case 0x65:
+                case 0x66:
+                case 0x67:
+                case 0x68:
+                case 0x69:
+                case 0x6A:
+                case 0x6B:
+                case 0x6C:
+                case 0x6D:
+                case 0x6E:
+                case 0x6F:
+                case 0x70:
+                case 0x71:
+                case 0x72:
+                case 0x73:
+                case 0x74:
+                case 0x75:
+                case 0x76:
+                case 0x77:
+                case 0x78:
+                case 0x79:
+                case 0x7A:
+                case 0x7B:
+                case 0x7C:
+                case 0x7D:
+                case 0x7E:
+                case 0x7F:
+                {
+                    add(current);
+                    break;
+                }
+
+                // U+0080..U+07FF: bytes C2..DF 80..BF
+                case 0xC2:
+                case 0xC3:
+                case 0xC4:
+                case 0xC5:
+                case 0xC6:
+                case 0xC7:
+                case 0xC8:
+                case 0xC9:
+                case 0xCA:
+                case 0xCB:
+                case 0xCC:
+                case 0xCD:
+                case 0xCE:
+                case 0xCF:
+                case 0xD0:
+                case 0xD1:
+                case 0xD2:
+                case 0xD3:
+                case 0xD4:
+                case 0xD5:
+                case 0xD6:
+                case 0xD7:
+                case 0xD8:
+                case 0xD9:
+                case 0xDA:
+                case 0xDB:
+                case 0xDC:
+                case 0xDD:
+                case 0xDE:
+                case 0xDF:
+                {
+                    if (JSON_HEDLEY_UNLIKELY(!next_byte_in_range({0x80, 0xBF})))
+                    {
+                        return token_type::parse_error;
+                    }
+                    break;
+                }
+
+                // U+0800..U+0FFF: bytes E0 A0..BF 80..BF
+                case 0xE0:
+                {
+                    if (JSON_HEDLEY_UNLIKELY(!(next_byte_in_range({0xA0, 0xBF, 0x80, 0xBF}))))
+                    {
+                        return token_type::parse_error;
+                    }
+                    break;
+                }
+
+                // U+1000..U+CFFF: bytes E1..EC 80..BF 80..BF
+                // U+E000..U+FFFF: bytes EE..EF 80..BF 80..BF
+                case 0xE1:
+                case 0xE2:
+                case 0xE3:
+                case 0xE4:
+                case 0xE5:
+                case 0xE6:
+                case 0xE7:
+                case 0xE8:
+                case 0xE9:
+                case 0xEA:
+                case 0xEB:
+                case 0xEC:
+                case 0xEE:
+                case 0xEF:
+                {
+                    if (JSON_HEDLEY_UNLIKELY(!(next_byte_in_range({0x80, 0xBF, 0x80, 0xBF}))))
+                    {
+                        return token_type::parse_error;
+                    }
+                    break;
+                }
+
+                // U+D000..U+D7FF: bytes ED 80..9F 80..BF
+                case 0xED:
+                {
+                    if (JSON_HEDLEY_UNLIKELY(!(next_byte_in_range({0x80, 0x9F, 0x80, 0xBF}))))
+                    {
+                        return token_type::parse_error;
+                    }
+                    break;
+                }
+
+                // U+10000..U+3FFFF: bytes F0 90..BF 80..BF 80..BF
+                case 0xF0:
+                {
+                    if (JSON_HEDLEY_UNLIKELY(!(next_byte_in_range({0x90, 0xBF, 0x80, 0xBF, 0x80, 0xBF}))))
+                    {
+                        return token_type::parse_error;
+                    }
+                    break;
+                }
+
+                // U+40000..U+FFFFF: bytes F1..F3 80..BF 80..BF 80..BF
+                case 0xF1:
+                case 0xF2:
+                case 0xF3:
+                {
+                    if (JSON_HEDLEY_UNLIKELY(!(next_byte_in_range({0x80, 0xBF, 0x80, 0xBF, 0x80, 0xBF}))))
+                    {
+                        return token_type::parse_error;
+                    }
+                    break;
+                }
+
+                // U+100000..U+10FFFF: bytes F4 80..8F 80..BF 80..BF
+                case 0xF4:
+                {
+                    if (JSON_HEDLEY_UNLIKELY(!(next_byte_in_range({0x80, 0x8F, 0x80, 0xBF, 0x80, 0xBF}))))
+                    {
+                        return token_type::parse_error;
+                    }
+                    break;
+                }
+
+                // remaining bytes (80..C1 and F5..FF) are ill-formed
+                default:
+                {
+                    error_message = "invalid string: ill-formed UTF-8 byte";
+                    return token_type::parse_error;
+                }
+            }
+        }
+    }
+
+    /*!
+     * @brief scan a comment
+     * @return whether comment could be scanned successfully
+     */
+    bool scan_comment()
+    {
+        switch (get())
+        {
+            // single-line comments skip input until a newline or EOF is read
+            case '/':
+            {
+                while (true)
+                {
+                    switch (get())
+                    {
+                        case '\n':
+                        case '\r':
+                        case std::char_traits<char_type>::eof():
+                        case '\0':
+                            return true;
+
+                        default:
+                            break;
+                    }
+                }
+            }
+
+            // multi-line comments skip input until */ is read
+            case '*':
+            {
+                while (true)
+                {
+                    switch (get())
+                    {
+                        case std::char_traits<char_type>::eof():
+                        case '\0':
+                        {
+                            error_message = "invalid comment; missing closing '*/'";
+                            return false;
+                        }
+
+                        case '*':
+                        {
+                            switch (get())
+                            {
+                                case '/':
+                                    return true;
+
+                                default:
+                                {
+                                    unget();
+                                    continue;
+                                }
+                            }
+                        }
+
+                        default:
+                            continue;
+                    }
+                }
+            }
+
+            // unexpected character after reading '/'
+            default:
+            {
+                error_message = "invalid comment; expecting '/' or '*' after '/'";
+                return false;
+            }
+        }
+    }
+
+    JSON_HEDLEY_NON_NULL(2)
+    static void strtof(float& f, const char* str, char** endptr) noexcept
+    {
+        f = std::strtof(str, endptr);
+    }
+
+    JSON_HEDLEY_NON_NULL(2)
+    static void strtof(double& f, const char* str, char** endptr) noexcept
+    {
+        f = std::strtod(str, endptr);
+    }
+
+    JSON_HEDLEY_NON_NULL(2)
+    static void strtof(long double& f, const char* str, char** endptr) noexcept
+    {
+        f = std::strtold(str, endptr);
+    }
+
+    /*!
+    @brief scan a number literal
+
+    This function scans a number literal according to Sect. 6 of RFC 8259.
+
+    The function is realized with a deterministic finite state machine derived
+    from the grammar described in RFC 8259. Starting in state "init", the
+    input is read and used to determine the next state. Only state "done"
+    accepts the number. State "error" is a trap state to model errors. In the
+    table below, "anything" means any character but the ones listed before.
+
+    state    | 0        | 1-9      | e E      | +       | -       | .        | anything
+    ---------|----------|----------|----------|---------|---------|----------|-----------
+    init     | zero     | any1     | [error]  | [error] | minus   | [error]  | [error]
+    minus    | zero     | any1     | [error]  | [error] | [error] | [error]  | [error]
+    zero     | done     | done     | exponent | done    | done    | decimal1 | done
+    any1     | any1     | any1     | exponent | done    | done    | decimal1 | done
+    decimal1 | decimal2 | decimal2 | [error]  | [error] | [error] | [error]  | [error]
+    decimal2 | decimal2 | decimal2 | exponent | done    | done    | done     | done
+    exponent | any2     | any2     | [error]  | sign    | sign    | [error]  | [error]
+    sign     | any2     | any2     | [error]  | [error] | [error] | [error]  | [error]
+    any2     | any2     | any2     | done     | done    | done    | done     | done
+
+    The state machine is realized with one label per state (prefixed with
+    "scan_number_") and `goto` statements between them. The state machine
+    contains cycles, but any cycle can be left when EOF is read. Therefore,
+    the function is guaranteed to terminate.
+
+    During scanning, the read bytes are stored in token_buffer. This string is
+    then converted to a signed integer, an unsigned integer, or a
+    floating-point number.
+
+    @return token_type::value_unsigned, token_type::value_integer, or
+            token_type::value_float if number could be successfully scanned,
+            token_type::parse_error otherwise
+
+    @note The scanner is independent of the current locale. Internally, the
+          locale's decimal point is used instead of `.` to work with the
+          locale-dependent converters.
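+
+    Illustrative examples (derived from the table above): "0" and "23" are
+    accepted as value_unsigned, "-42" as value_integer, and "1.5e3" or
+    "10E-2" as value_float; inputs such as "1." or "-" end in the [error]
+    column and yield token_type::parse_error.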
+    */
+    token_type scan_number()  // lgtm [cpp/use-of-goto]
+    {
+        // reset token_buffer to store the number's bytes
+        reset();
+
+        // the type of the parsed number; initially set to unsigned; will be
+        // changed if minus sign, decimal point or exponent is read
+        token_type number_type = token_type::value_unsigned;
+
+        // state (init): we just found out we need to scan a number
+        switch (current)
+        {
+            case '-':
+            {
+                add(current);
+                goto scan_number_minus;
+            }
+
+            case '0':
+            {
+                add(current);
+                goto scan_number_zero;
+            }
+
+            case '1':
+            case '2':
+            case '3':
+            case '4':
+            case '5':
+            case '6':
+            case '7':
+            case '8':
+            case '9':
+            {
+                add(current);
+                goto scan_number_any1;
+            }
+
+            // all other characters are rejected outside scan_number()
+            default:            // LCOV_EXCL_LINE
+                JSON_ASSERT(false); // NOLINT(cert-dcl03-c,hicpp-static-assert,misc-static-assert) LCOV_EXCL_LINE
+        }
+
+scan_number_minus:
+        // state: we just parsed a leading minus sign
+        number_type = token_type::value_integer;
+        switch (get())
+        {
+            case '0':
+            {
+                add(current);
+                goto scan_number_zero;
+            }
+
+            case '1':
+            case '2':
+            case '3':
+            case '4':
+            case '5':
+            case '6':
+            case '7':
+            case '8':
+            case '9':
+            {
+                add(current);
+                goto scan_number_any1;
+            }
+
+            default:
+            {
+                error_message = "invalid number; expected digit after '-'";
+                return token_type::parse_error;
+            }
+        }
+
+scan_number_zero:
+        // state: we just parsed a zero (maybe with a leading minus sign)
+        switch (get())
+        {
+            case '.':
+            {
+                add(decimal_point_char);
+                goto scan_number_decimal1;
+            }
+
+            case 'e':
+            case 'E':
+            {
+                add(current);
+                goto scan_number_exponent;
+            }
+
+            default:
+                goto scan_number_done;
+        }
+
+scan_number_any1:
+        // state: we just parsed a digit (maybe with a leading minus sign)
+        switch (get())
+        {
+            case '0':
+            case '1':
+            case '2':
+            case '3':
+            case '4':
+            case '5':
+            case '6':
+            case '7':
+            case '8':
+            case '9':
+            {
+                add(current);
+                goto scan_number_any1;
+            }
+
+            case '.':
+            {
+                add(decimal_point_char);
+                goto scan_number_decimal1;
+            }
+
+            case 'e':
+            case 'E':
+            {
+                add(current);
+                goto scan_number_exponent;
+            }
+
+            default:
+                goto scan_number_done;
+        }
+
+scan_number_decimal1:
+        // state: we just parsed a decimal point
+        number_type = token_type::value_float;
+        switch (get())
+        {
+            case '0':
+            case '1':
+            case '2':
+            case '3':
+            case '4':
+            case '5':
+            case '6':
+            case '7':
+            case '8':
+            case '9':
+            {
+                add(current);
+                goto scan_number_decimal2;
+            }
+
+            default:
+            {
+                error_message = "invalid number; expected digit after '.'";
+                return token_type::parse_error;
+            }
+        }
+
+scan_number_decimal2:
+        // we just parsed at least one digit after a decimal point
+        switch (get())
+        {
+            case '0':
+            case '1':
+            case '2':
+            case '3':
+            case '4':
+            case '5':
+            case '6':
+            case '7':
+            case '8':
+            case '9':
+            {
+                add(current);
+                goto scan_number_decimal2;
+            }
+
+            case 'e':
+            case 'E':
+            {
+                add(current);
+                goto scan_number_exponent;
+            }
+
+            default:
+                goto scan_number_done;
+        }
+
+scan_number_exponent:
+        // we just parsed an exponent character ('e' or 'E')
+        number_type = token_type::value_float;
+        switch (get())
+        {
+            case '+':
+            case '-':
+            {
+                add(current);
+                goto scan_number_sign;
+            }
+
+            case '0':
+            case '1':
+            case '2':
+            case '3':
+            case '4':
+            case '5':
+            case '6':
+            case '7':
+            case '8':
+            case '9':
+            {
+                add(current);
+                goto scan_number_any2;
+            }
+
+            default:
+            {
+                error_message =
+                    "invalid number; expected '+', '-', or digit after exponent";
+                return token_type::parse_error;
+            }
+        }
+
+scan_number_sign:
+        // we just parsed an exponent sign
+        switch (get())
+        {
+            case '0':
+            case '1':
+            case '2':
+            case '3':
+            case '4':
+            case '5':
+            case '6':
+            case '7':
+            case '8':
+            case '9':
+            {
+                add(current);
+                goto scan_number_any2;
+            }
+
+            default:
+            {
+                error_message = "invalid number; expected digit after exponent sign";
+                return token_type::parse_error;
+            }
+        }
+
+scan_number_any2:
+        // we just parsed a digit after the exponent or exponent sign
+        switch (get())
+        {
+            case '0':
+            case '1':
+            case '2':
+            case '3':
+            case '4':
+            case '5':
+            case '6':
+            case '7':
+            case '8':
+            case '9':
+            {
+                add(current);
+                goto scan_number_any2;
+            }
+
+            default:
+                goto scan_number_done;
+        }
+
+scan_number_done:
+        // unget the character after the number (we only read it to know that
+        // we are done scanning a number)
+        unget();
+
+        char* endptr = nullptr; // NOLINT(cppcoreguidelines-pro-type-vararg,hicpp-vararg)
+        errno = 0;
+
+        // try to parse integers first and fall back to floats
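+        // (illustrative: "18446744073709551616", i.e. 2^64, overflows
+        // strtoull and sets errno to ERANGE, so the integer branches are
+        // skipped and the literal is re-parsed as a float below)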
+        if (number_type == token_type::value_unsigned)
+        {
+            const auto x = std::strtoull(token_buffer.data(), &endptr, 10);
+
+            // we checked the number format before
+            JSON_ASSERT(endptr == token_buffer.data() + token_buffer.size());
+
+            if (errno == 0)
+            {
+                value_unsigned = static_cast<number_unsigned_t>(x);
+                if (value_unsigned == x)
+                {
+                    return token_type::value_unsigned;
+                }
+            }
+        }
+        else if (number_type == token_type::value_integer)
+        {
+            const auto x = std::strtoll(token_buffer.data(), &endptr, 10);
+
+            // we checked the number format before
+            JSON_ASSERT(endptr == token_buffer.data() + token_buffer.size());
+
+            if (errno == 0)
+            {
+                value_integer = static_cast<number_integer_t>(x);
+                if (value_integer == x)
+                {
+                    return token_type::value_integer;
+                }
+            }
+        }
+
+        // this code is reached if we parse a floating-point number or if an
+        // integer conversion above failed
+        strtof(value_float, token_buffer.data(), &endptr);
+
+        // we checked the number format before
+        JSON_ASSERT(endptr == token_buffer.data() + token_buffer.size());
+
+        return token_type::value_float;
+    }
+
+    /*!
+    @param[in] literal_text  the literal text to expect
+    @param[in] length        the length of the passed literal text
+    @param[in] return_type   the token type to return on success
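+    @return @a return_type if the literal was read successfully,
+            token_type::parse_error otherwise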
+    */
+    JSON_HEDLEY_NON_NULL(2)
+    token_type scan_literal(const char_type* literal_text, const std::size_t length,
+                            token_type return_type)
+    {
+        JSON_ASSERT(std::char_traits<char_type>::to_char_type(current) == literal_text[0]);
+        for (std::size_t i = 1; i < length; ++i)
+        {
+            if (JSON_HEDLEY_UNLIKELY(std::char_traits<char_type>::to_char_type(get()) != literal_text[i]))
+            {
+                error_message = "invalid literal";
+                return token_type::parse_error;
+            }
+        }
+        return return_type;
+    }
+
+    /////////////////////
+    // input management
+    /////////////////////
+
+    /// reset token_buffer; current character is beginning of token
+    void reset() noexcept
+    {
+        token_buffer.clear();
+        token_string.clear();
+        token_string.push_back(std::char_traits<char_type>::to_char_type(current));
+    }
+
+    /*!
+    @brief get next character from the input
+
+    This function provides the interface to the used input adapter. It does
+    not throw when the input reaches EOF, but returns
+    `std::char_traits<char_type>::eof()` in that case. The scanned characters
+    are stored for use in error messages.
+
+    @return character read from the input
+    */
+    char_int_type get()
+    {
+        ++position.chars_read_total;
+        ++position.chars_read_current_line;
+
+        if (next_unget)
+        {
+            // just reset the next_unget variable and work with current
+            next_unget = false;
+        }
+        else
+        {
+            current = ia.get_character();
+        }
+
+        if (JSON_HEDLEY_LIKELY(current != std::char_traits<char_type>::eof()))
+        {
+            token_string.push_back(std::char_traits<char_type>::to_char_type(current));
+        }
+
+        if (current == '\n')
+        {
+            ++position.lines_read;
+            position.chars_read_current_line = 0;
+        }
+
+        return current;
+    }
+
+    /*!
+    @brief unget current character (read it again on next get)
+
+    We implement unget by setting variable next_unget to true. The input is not
+    changed - we just simulate ungetting by modifying chars_read_total,
+    chars_read_current_line, and token_string. The next call to get() will
+    behave as if the unget character is read again.
+    */
+    void unget()
+    {
+        next_unget = true;
+
+        --position.chars_read_total;
+
+        // in case we "unget" a newline, we have to also decrement the lines_read
+        if (position.chars_read_current_line == 0)
+        {
+            if (position.lines_read > 0)
+            {
+                --position.lines_read;
+            }
+        }
+        else
+        {
+            --position.chars_read_current_line;
+        }
+
+        if (JSON_HEDLEY_LIKELY(current != std::char_traits<char_type>::eof()))
+        {
+            JSON_ASSERT(!token_string.empty());
+            token_string.pop_back();
+        }
+    }
+
+    /// add a character to token_buffer
+    void add(char_int_type c)
+    {
+        token_buffer.push_back(static_cast<typename string_t::value_type>(c));
+    }
+
+  public:
+    /////////////////////
+    // value getters
+    /////////////////////
+
+    /// return integer value
+    constexpr number_integer_t get_number_integer() const noexcept
+    {
+        return value_integer;
+    }
+
+    /// return unsigned integer value
+    constexpr number_unsigned_t get_number_unsigned() const noexcept
+    {
+        return value_unsigned;
+    }
+
+    /// return floating-point value
+    constexpr number_float_t get_number_float() const noexcept
+    {
+        return value_float;
+    }
+
+    /// return current string value (implicitly resets the token; useful only once)
+    string_t& get_string()
+    {
+        return token_buffer;
+    }
+
+    /////////////////////
+    // diagnostics
+    /////////////////////
+
+    /// return position of last read token
+    constexpr position_t get_position() const noexcept
+    {
+        return position;
+    }
+
+    /// return the last read token (for errors only). Never contains EOF
+    /// (an arbitrary value outside the valid char range, often -1), since the
+    /// char value 255 may legitimately occur. May contain NUL, which is escaped.
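+    /// (illustrative: a raw newline in the token is rendered as "<U+000A>")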
+    std::string get_token_string() const
+    {
+        // escape control characters
+        std::string result;
+        for (const auto c : token_string)
+        {
+            if (static_cast<unsigned char>(c) <= '\x1F')
+            {
+                // escape control characters
+                std::array<char, 9> cs{{}};
+                static_cast<void>((std::snprintf)(cs.data(), cs.size(), "<U+%.4X>", static_cast<unsigned char>(c))); // NOLINT(cppcoreguidelines-pro-type-vararg,hicpp-vararg)
+                result += cs.data();
+            }
+            else
+            {
+                // add character as is
+                result.push_back(static_cast<std::string::value_type>(c));
+            }
+        }
+
+        return result;
+    }
+
+    /// return syntax error message
+    JSON_HEDLEY_RETURNS_NON_NULL
+    constexpr const char* get_error_message() const noexcept
+    {
+        return error_message;
+    }
+
+    /////////////////////
+    // actual scanner
+    /////////////////////
+
+    /*!
+    @brief skip the UTF-8 byte order mark
+    @return true iff there is no BOM or the correct BOM has been skipped
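+
+    (illustrative: the input "\xEF\xBB\xBF{}" is parsed like "{}"; a first
+    byte of 0xEF not followed by 0xBB 0xBF makes scan() report an invalid BOM)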
+    */
+    bool skip_bom()
+    {
+        if (get() == 0xEF)
+        {
+            // check if we completely parse the BOM
+            return get() == 0xBB && get() == 0xBF;
+        }
+
+        // the first character is not the beginning of the BOM; unget it to
+        // process it later
+        unget();
+        return true;
+    }
+
+    void skip_whitespace()
+    {
+        do
+        {
+            get();
+        }
+        while (current == ' ' || current == '\t' || current == '\n' || current == '\r');
+    }
+
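+    /*!
+    @brief scan the next token from the input
+
+    Skips the BOM on the first call, then whitespace and (if enabled)
+    comments, and dispatches on the next character to the structural,
+    literal, string, or number scanners.
+
+    @return the type of the scanned token
+    */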
+    token_type scan()
+    {
+        // initially, skip the BOM
+        if (position.chars_read_total == 0 && !skip_bom())
+        {
+            error_message = "invalid BOM; must be 0xEF 0xBB 0xBF if given";
+            return token_type::parse_error;
+        }
+
+        // read next character and ignore whitespace
+        skip_whitespace();
+
+        // ignore comments
+        while (ignore_comments && current == '/')
+        {
+            if (!scan_comment())
+            {
+                return token_type::parse_error;
+            }
+
+            // skip following whitespace
+            skip_whitespace();
+        }
+
+        switch (current)
+        {
+            // structural characters
+            case '[':
+                return token_type::begin_array;
+            case ']':
+                return token_type::end_array;
+            case '{':
+                return token_type::begin_object;
+            case '}':
+                return token_type::end_object;
+            case ':':
+                return token_type::name_separator;
+            case ',':
+                return token_type::value_separator;
+
+            // literals
+            case 't':
+            {
+                std::array<char_type, 4> true_literal = {{static_cast<char_type>('t'), static_cast<char_type>('r'), static_cast<char_type>('u'), static_cast<char_type>('e')}};
+                return scan_literal(true_literal.data(), true_literal.size(), token_type::literal_true);
+            }
+            case 'f':
+            {
+                std::array<char_type, 5> false_literal = {{static_cast<char_type>('f'), static_cast<char_type>('a'), static_cast<char_type>('l'), static_cast<char_type>('s'), static_cast<char_type>('e')}};
+                return scan_literal(false_literal.data(), false_literal.size(), token_type::literal_false);
+            }
+            case 'n':
+            {
+                std::array<char_type, 4> null_literal = {{static_cast<char_type>('n'), static_cast<char_type>('u'), static_cast<char_type>('l'), static_cast<char_type>('l')}};
+                return scan_literal(null_literal.data(), null_literal.size(), token_type::literal_null);
+            }
+
+            // string
+            case '\"':
+                return scan_string();
+
+            // number
+            case '-':
+            case '0':
+            case '1':
+            case '2':
+            case '3':
+            case '4':
+            case '5':
+            case '6':
+            case '7':
+            case '8':
+            case '9':
+                return scan_number();
+
+            // end of input (the null byte is needed when parsing from
+            // string literals)
+            case '\0':
+            case std::char_traits<char_type>::eof():
+                return token_type::end_of_input;
+
+            // error
+            default:
+                error_message = "invalid literal";
+                return token_type::parse_error;
+        }
+    }
+
+  private:
+    /// input adapter
+    InputAdapterType ia;
+
+    /// whether comments should be ignored (true) or signaled as errors (false)
+    const bool ignore_comments = false;
+
+    /// the current character
+    char_int_type current = std::char_traits<char_type>::eof();
+
+    /// whether the next get() call should just return current
+    bool next_unget = false;
+
+    /// the start position of the current token
+    position_t position {};
+
+    /// raw input token string (for error messages)
+    std::vector<char_type> token_string {};
+
+    /// buffer for variable-length tokens (numbers, strings)
+    string_t token_buffer {};
+
+    /// a description of occurred lexer errors
+    const char* error_message = "";
+
+    // number values
+    number_integer_t value_integer = 0;
+    number_unsigned_t value_unsigned = 0;
+    number_float_t value_float = 0;
+
+    /// the decimal point
+    const char_int_type decimal_point_char = '.';
+};
+}  // namespace detail
+}  // namespace nlohmann
+
+// #include <nlohmann/detail/macro_scope.hpp>
+
+// #include <nlohmann/detail/meta/is_sax.hpp>
+
+
+#include <cstddef> // size_t
+#include <utility> // declval
+#include <string> // string
+
+// #include <nlohmann/detail/meta/detected.hpp>
+
+// #include <nlohmann/detail/meta/type_traits.hpp>
+
+
+namespace nlohmann
+{
+namespace detail
+{
+template<typename T>
+using null_function_t = decltype(std::declval<T&>().null());
+
+template<typename T>
+using boolean_function_t =
+    decltype(std::declval<T&>().boolean(std::declval<bool>()));
+
+template<typename T, typename Integer>
+using number_integer_function_t =
+    decltype(std::declval<T&>().number_integer(std::declval<Integer>()));
+
+template<typename T, typename Unsigned>
+using number_unsigned_function_t =
+    decltype(std::declval<T&>().number_unsigned(std::declval<Unsigned>()));
+
+template<typename T, typename Float, typename String>
+using number_float_function_t = decltype(std::declval<T&>().number_float(
+                                    std::declval<Float>(), std::declval<const String&>()));
+
+template<typename T, typename String>
+using string_function_t =
+    decltype(std::declval<T&>().string(std::declval<String&>()));
+
+template<typename T, typename Binary>
+using binary_function_t =
+    decltype(std::declval<T&>().binary(std::declval<Binary&>()));
+
+template<typename T>
+using start_object_function_t =
+    decltype(std::declval<T&>().start_object(std::declval<std::size_t>()));
+
+template<typename T, typename String>
+using key_function_t =
+    decltype(std::declval<T&>().key(std::declval<String&>()));
+
+template<typename T>
+using end_object_function_t = decltype(std::declval<T&>().end_object());
+
+template<typename T>
+using start_array_function_t =
+    decltype(std::declval<T&>().start_array(std::declval<std::size_t>()));
+
+template<typename T>
+using end_array_function_t = decltype(std::declval<T&>().end_array());
+
+template<typename T, typename Exception>
+using parse_error_function_t = decltype(std::declval<T&>().parse_error(
+        std::declval<std::size_t>(), std::declval<const std::string&>(),
+        std::declval<const Exception&>()));
+
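+/*!
+@brief compile-time check that a SAX type provides the complete event
+interface (detected via the alias templates above)
+*/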
+template<typename SAX, typename BasicJsonType>
+struct is_sax
+{
+  private:
+    static_assert(is_basic_json<BasicJsonType>::value,
+                  "BasicJsonType must be of type basic_json<...>");
+
+    using number_integer_t = typename BasicJsonType::number_integer_t;
+    using number_unsigned_t = typename BasicJsonType::number_unsigned_t;
+    using number_float_t = typename BasicJsonType::number_float_t;
+    using string_t = typename BasicJsonType::string_t;
+    using binary_t = typename BasicJsonType::binary_t;
+    using exception_t = typename BasicJsonType::exception;
+
+  public:
+    static constexpr bool value =
+        is_detected_exact<bool, null_function_t, SAX>::value &&
+        is_detected_exact<bool, boolean_function_t, SAX>::value &&
+        is_detected_exact<bool, number_integer_function_t, SAX, number_integer_t>::value &&
+        is_detected_exact<bool, number_unsigned_function_t, SAX, number_unsigned_t>::value &&
+        is_detected_exact<bool, number_float_function_t, SAX, number_float_t, string_t>::value &&
+        is_detected_exact<bool, string_function_t, SAX, string_t>::value &&
+        is_detected_exact<bool, binary_function_t, SAX, binary_t>::value &&
+        is_detected_exact<bool, start_object_function_t, SAX>::value &&
+        is_detected_exact<bool, key_function_t, SAX, string_t>::value &&
+        is_detected_exact<bool, end_object_function_t, SAX>::value &&
+        is_detected_exact<bool, start_array_function_t, SAX>::value &&
+        is_detected_exact<bool, end_array_function_t, SAX>::value &&
+        is_detected_exact<bool, parse_error_function_t, SAX, exception_t>::value;
+};
+
+template<typename SAX, typename BasicJsonType>
+struct is_sax_static_asserts
+{
+  private:
+    static_assert(is_basic_json<BasicJsonType>::value,
+                  "BasicJsonType must be of type basic_json<...>");
+
+    using number_integer_t = typename BasicJsonType::number_integer_t;
+    using number_unsigned_t = typename BasicJsonType::number_unsigned_t;
+    using number_float_t = typename BasicJsonType::number_float_t;
+    using string_t = typename BasicJsonType::string_t;
+    using binary_t = typename BasicJsonType::binary_t;
+    using exception_t = typename BasicJsonType::exception;
+
+  public:
+    static_assert(is_detected_exact<bool, null_function_t, SAX>::value,
+                  "Missing/invalid function: bool null()");
+    static_assert(is_detected_exact<bool, boolean_function_t, SAX>::value,
+                  "Missing/invalid function: bool boolean(bool)");
+    static_assert(
+        is_detected_exact<bool, number_integer_function_t, SAX,
+        number_integer_t>::value,
+        "Missing/invalid function: bool number_integer(number_integer_t)");
+    static_assert(
+        is_detected_exact<bool, number_unsigned_function_t, SAX,
+        number_unsigned_t>::value,
+        "Missing/invalid function: bool number_unsigned(number_unsigned_t)");
+    static_assert(is_detected_exact<bool, number_float_function_t, SAX,
+                  number_float_t, string_t>::value,
+                  "Missing/invalid function: bool number_float(number_float_t, const string_t&)");
+    static_assert(
+        is_detected_exact<bool, string_function_t, SAX, string_t>::value,
+        "Missing/invalid function: bool string(string_t&)");
+    static_assert(
+        is_detected_exact<bool, binary_function_t, SAX, binary_t>::value,
+        "Missing/invalid function: bool binary(binary_t&)");
+    static_assert(is_detected_exact<bool, start_object_function_t, SAX>::value,
+                  "Missing/invalid function: bool start_object(std::size_t)");
+    static_assert(is_detected_exact<bool, key_function_t, SAX, string_t>::value,
+                  "Missing/invalid function: bool key(string_t&)");
+    static_assert(is_detected_exact<bool, end_object_function_t, SAX>::value,
+                  "Missing/invalid function: bool end_object()");
+    static_assert(is_detected_exact<bool, start_array_function_t, SAX>::value,
+                  "Missing/invalid function: bool start_array(std::size_t)");
+    static_assert(is_detected_exact<bool, end_array_function_t, SAX>::value,
+                  "Missing/invalid function: bool end_array()");
+    static_assert(
+        is_detected_exact<bool, parse_error_function_t, SAX, exception_t>::value,
+        "Missing/invalid function: bool parse_error(std::size_t, const "
+        "std::string&, const exception&)");
+};
+}  // namespace detail
+}  // namespace nlohmann
+
+// #include <nlohmann/detail/meta/type_traits.hpp>
+
+// #include <nlohmann/detail/value_t.hpp>
+
+
+namespace nlohmann
+{
+namespace detail
+{
+
+/// how to treat CBOR tags
+enum class cbor_tag_handler_t
+{
+    error,   ///< throw a parse_error exception in case of a tag
+    ignore,  ///< ignore tags
+    store    ///< store tags as binary type
+};
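+// (illustrative usage: json::from_cbor(data, /*strict*/ true,
+// /*allow_exceptions*/ true, cbor_tag_handler_t::ignore) accepts tagged
+// CBOR input and discards the tags)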
+
+/*!
+@brief determine system byte order
+
+@return true if and only if system's byte order is little endian
+
+@note from https://stackoverflow.com/a/1001328/266378
+*/
+static inline bool little_endianness(int num = 1) noexcept
+{
+    return *reinterpret_cast<char*>(&num) == 1;
+}
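+// (illustrative: on a little-endian machine the int 1 is stored as
+// 0x01 0x00 ... 0x00, so its lowest-addressed byte equals 1)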
+
+
+///////////////////
+// binary reader //
+///////////////////
+
+/*!
+@brief deserialization of CBOR, MessagePack, and UBJSON values
+*/
+template<typename BasicJsonType, typename InputAdapterType, typename SAX = json_sax_dom_parser<BasicJsonType>>
+class binary_reader
+{
+    using number_integer_t = typename BasicJsonType::number_integer_t;
+    using number_unsigned_t = typename BasicJsonType::number_unsigned_t;
+    using number_float_t = typename BasicJsonType::number_float_t;
+    using string_t = typename BasicJsonType::string_t;
+    using binary_t = typename BasicJsonType::binary_t;
+    using json_sax_t = SAX;
+    using char_type = typename InputAdapterType::char_type;
+    using char_int_type = typename std::char_traits<char_type>::int_type;
+
+  public:
+    /*!
+    @brief create a binary reader
+
+    @param[in] adapter  input adapter to read from
+    */
+    explicit binary_reader(InputAdapterType&& adapter) noexcept : ia(std::move(adapter))
+    {
+        (void)detail::is_sax_static_asserts<SAX, BasicJsonType> {};
+    }
+
+    // make class move-only
+    binary_reader(const binary_reader&) = delete;
+    binary_reader(binary_reader&&) = default; // NOLINT(hicpp-noexcept-move,performance-noexcept-move-constructor)
+    binary_reader& operator=(const binary_reader&) = delete;
+    binary_reader& operator=(binary_reader&&) = default; // NOLINT(hicpp-noexcept-move,performance-noexcept-move-constructor)
+    ~binary_reader() = default;
+
+    /*!
+    @param[in] format  the binary format to parse
+    @param[in] sax_    a SAX event processor
+    @param[in] strict  whether to expect the input to be consumed completely
+    @param[in] tag_handler  how to treat CBOR tags
+
+    @return whether parsing was successful
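+
+    @note In strict mode, exactly one value is expected; for UBJSON, no-op
+    ('N') bytes after the value are skipped before checking for end of input.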
+    */
+    JSON_HEDLEY_NON_NULL(3)
+    bool sax_parse(const input_format_t format,
+                   json_sax_t* sax_,
+                   const bool strict = true,
+                   const cbor_tag_handler_t tag_handler = cbor_tag_handler_t::error)
+    {
+        sax = sax_;
+        bool result = false;
+
+        switch (format)
+        {
+            case input_format_t::bson:
+                result = parse_bson_internal();
+                break;
+
+            case input_format_t::cbor:
+                result = parse_cbor_internal(true, tag_handler);
+                break;
+
+            case input_format_t::msgpack:
+                result = parse_msgpack_internal();
+                break;
+
+            case input_format_t::ubjson:
+                result = parse_ubjson_internal();
+                break;
+
+            case input_format_t::json: // LCOV_EXCL_LINE
+            default:            // LCOV_EXCL_LINE
+                JSON_ASSERT(false); // NOLINT(cert-dcl03-c,hicpp-static-assert,misc-static-assert) LCOV_EXCL_LINE
+        }
+
+        // strict mode: next byte must be EOF
+        if (result && strict)
+        {
+            if (format == input_format_t::ubjson)
+            {
+                get_ignore_noop();
+            }
+            else
+            {
+                get();
+            }
+
+            if (JSON_HEDLEY_UNLIKELY(current != std::char_traits<char_type>::eof()))
+            {
+                return sax->parse_error(chars_read, get_token_string(),
+                                        parse_error::create(110, chars_read, exception_message(format, "expected end of input; last byte: 0x" + get_token_string(), "value"), BasicJsonType()));
+            }
+        }
+
+        return result;
+    }
+
+  private:
+    //////////
+    // BSON //
+    //////////
+
+    /*!
+    @brief Reads in a BSON-object and passes it to the SAX-parser.
+    @return whether a valid BSON-value was passed to the SAX parser
+    */
+    bool parse_bson_internal()
+    {
+        std::int32_t document_size{};
+        get_number<std::int32_t, true>(input_format_t::bson, document_size);
+
+        if (JSON_HEDLEY_UNLIKELY(!sax->start_object(static_cast<std::size_t>(-1))))
+        {
+            return false;
+        }
+
+        if (JSON_HEDLEY_UNLIKELY(!parse_bson_element_list(/*is_array*/false)))
+        {
+            return false;
+        }
+
+        return sax->end_object();
+    }
+
+    /*!
+    @brief Parses a C-style string from the BSON input.
+    @param[in,out] result  A reference to the string variable where the read
+                            string is to be stored.
+    @return `true` if the \x00-byte indicating the end of the string was
+             encountered before the EOF; `false` indicates an unexpected EOF.
+    */
+    bool get_bson_cstr(string_t& result)
+    {
+        auto out = std::back_inserter(result);
+        while (true)
+        {
+            get();
+            if (JSON_HEDLEY_UNLIKELY(!unexpect_eof(input_format_t::bson, "cstring")))
+            {
+                return false;
+            }
+            if (current == 0x00)
+            {
+                return true;
+            }
+            *out++ = static_cast<typename string_t::value_type>(current);
+        }
+    }
+
+    /*!
+    @brief Parses a zero-terminated string of length @a len from the BSON
+           input.
+    @param[in] len  The length (including the zero-byte at the end) of the
+                    string to be read.
+    @param[in,out] result  A reference to the string variable where the read
+                            string is to be stored.
+    @tparam NumberType The type of the length @a len
+    @pre len >= 1
+    @return `true` if the string was successfully parsed
+    */
+    template<typename NumberType>
+    bool get_bson_string(const NumberType len, string_t& result)
+    {
+        if (JSON_HEDLEY_UNLIKELY(len < 1))
+        {
+            auto last_token = get_token_string();
+            return sax->parse_error(chars_read, last_token, parse_error::create(112, chars_read, exception_message(input_format_t::bson, "string length must be at least 1, is " + std::to_string(len), "string"), BasicJsonType()));
+        }
+
+        return get_string(input_format_t::bson, len - static_cast<NumberType>(1), result) && get() != std::char_traits<char_type>::eof();
+    }
+
+    /*!
+    @brief Parses a byte array input of length @a len from the BSON input.
+    @param[in] len  The length of the byte array to be read.
+    @param[in,out] result  A reference to the binary variable where the read
+                            array is to be stored.
+    @tparam NumberType The type of the length @a len
+    @pre len >= 0
+    @return `true` if the byte array was successfully parsed
+    */
+    template<typename NumberType>
+    bool get_bson_binary(const NumberType len, binary_t& result)
+    {
+        if (JSON_HEDLEY_UNLIKELY(len < 0))
+        {
+            auto last_token = get_token_string();
+            return sax->parse_error(chars_read, last_token, parse_error::create(112, chars_read, exception_message(input_format_t::bson, "byte array length cannot be negative, is " + std::to_string(len), "binary"), BasicJsonType()));
+        }
+
+        // All BSON binary values have a subtype
+        std::uint8_t subtype{};
+        get_number<std::uint8_t>(input_format_t::bson, subtype);
+        result.set_subtype(subtype);
+
+        return get_binary(input_format_t::bson, len, result);
+    }
+
+    /*!
+    @brief Read a BSON document element of the given @a element_type.
+    @param[in] element_type The BSON element type, cf. http://bsonspec.org/spec.html
+    @param[in] element_type_parse_position The position in the input stream,
+               where the `element_type` was read.
+    @warning Not all BSON element types are supported yet. An unsupported
+             @a element_type will give rise to a parse_error.114:
+             Unsupported BSON record type 0x...
+    @return whether a valid BSON-object/array was passed to the SAX parser
+    */
+    bool parse_bson_element_internal(const char_int_type element_type,
+                                     const std::size_t element_type_parse_position)
+    {
+        switch (element_type)
+        {
+            case 0x01: // double
+            {
+                double number{};
+                return get_number<double, true>(input_format_t::bson, number) && sax->number_float(static_cast<number_float_t>(number), "");
+            }
+
+            case 0x02: // string
+            {
+                std::int32_t len{};
+                string_t value;
+                return get_number<std::int32_t, true>(input_format_t::bson, len) && get_bson_string(len, value) && sax->string(value);
+            }
+
+            case 0x03: // object
+            {
+                return parse_bson_internal();
+            }
+
+            case 0x04: // array
+            {
+                return parse_bson_array();
+            }
+
+            case 0x05: // binary
+            {
+                std::int32_t len{};
+                binary_t value;
+                return get_number<std::int32_t, true>(input_format_t::bson, len) && get_bson_binary(len, value) && sax->binary(value);
+            }
+
+            case 0x08: // boolean
+            {
+                return sax->boolean(get() != 0);
+            }
+
+            case 0x0A: // null
+            {
+                return sax->null();
+            }
+
+            case 0x10: // int32
+            {
+                std::int32_t value{};
+                return get_number<std::int32_t, true>(input_format_t::bson, value) && sax->number_integer(value);
+            }
+
+            case 0x12: // int64
+            {
+                std::int64_t value{};
+                return get_number<std::int64_t, true>(input_format_t::bson, value) && sax->number_integer(value);
+            }
+
+            default: // anything else not supported (yet)
+            {
+                std::array<char, 3> cr{{}};
+                static_cast<void>((std::snprintf)(cr.data(), cr.size(), "%.2hhX", static_cast<unsigned char>(element_type))); // NOLINT(cppcoreguidelines-pro-type-vararg,hicpp-vararg)
+                return sax->parse_error(element_type_parse_position, std::string(cr.data()), parse_error::create(114, element_type_parse_position, "Unsupported BSON record type 0x" + std::string(cr.data()), BasicJsonType()));
+            }
+        }
+    }
+
+    /*!
+    @brief Read a BSON element list (as specified in the BSON-spec)
+
+    The same binary layout is used for objects and arrays, hence it must be
+    indicated with the argument @a is_array which one is expected
+    (true --> array, false --> object).
+
+    @param[in] is_array Determines if the element list being read is to be
+                        treated as an object (@a is_array == false), or as an
+                        array (@a is_array == true).
+    @return whether a valid BSON-object/array was passed to the SAX parser
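+
+    @note BSON arrays store their indices as keys ("0", "1", ...); the keys
+    are read here but not forwarded to the SAX parser when @a is_array is true.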
+    */
+    bool parse_bson_element_list(const bool is_array)
+    {
+        string_t key;
+
+        while (auto element_type = get())
+        {
+            if (JSON_HEDLEY_UNLIKELY(!unexpect_eof(input_format_t::bson, "element list")))
+            {
+                return false;
+            }
+
+            const std::size_t element_type_parse_position = chars_read;
+            if (JSON_HEDLEY_UNLIKELY(!get_bson_cstr(key)))
+            {
+                return false;
+            }
+
+            if (!is_array && !sax->key(key))
+            {
+                return false;
+            }
+
+            if (JSON_HEDLEY_UNLIKELY(!parse_bson_element_internal(element_type, element_type_parse_position)))
+            {
+                return false;
+            }
+
+            // get_bson_cstr only appends
+            key.clear();
+        }
+
+        return true;
+    }
+
+    /*!
+    @brief Reads an array from the BSON input and passes it to the SAX-parser.
+    @return whether a valid BSON-array was passed to the SAX parser
+    */
+    bool parse_bson_array()
+    {
+        std::int32_t document_size{};
+        get_number<std::int32_t, true>(input_format_t::bson, document_size);
+
+        if (JSON_HEDLEY_UNLIKELY(!sax->start_array(static_cast<std::size_t>(-1))))
+        {
+            return false;
+        }
+
+        if (JSON_HEDLEY_UNLIKELY(!parse_bson_element_list(/*is_array*/true)))
+        {
+            return false;
+        }
+
+        return sax->end_array();
+    }
+
+    //////////
+    // CBOR //
+    //////////
+
+    /*!
+    @param[in] get_char  whether a new character should be retrieved from the
+                         input (true) or whether the last read character should
+                         be considered instead (false)
+    @param[in] tag_handler how CBOR tags should be treated
+
+    @return whether a valid CBOR value was passed to the SAX parser
+    */
+    bool parse_cbor_internal(const bool get_char,
+                             const cbor_tag_handler_t tag_handler)
+    {
+        switch (get_char ? get() : current)
+        {
+            // EOF
+            case std::char_traits<char_type>::eof():
+                return unexpect_eof(input_format_t::cbor, "value");
+
+            // Integer 0x00..0x17 (0..23)
+            case 0x00:
+            case 0x01:
+            case 0x02:
+            case 0x03:
+            case 0x04:
+            case 0x05:
+            case 0x06:
+            case 0x07:
+            case 0x08:
+            case 0x09:
+            case 0x0A:
+            case 0x0B:
+            case 0x0C:
+            case 0x0D:
+            case 0x0E:
+            case 0x0F:
+            case 0x10:
+            case 0x11:
+            case 0x12:
+            case 0x13:
+            case 0x14:
+            case 0x15:
+            case 0x16:
+            case 0x17:
+                return sax->number_unsigned(static_cast<number_unsigned_t>(current));
+
+            case 0x18: // Unsigned integer (one-byte uint8_t follows)
+            {
+                std::uint8_t number{};
+                return get_number(input_format_t::cbor, number) && sax->number_unsigned(number);
+            }
+
+            case 0x19: // Unsigned integer (two-byte uint16_t follows)
+            {
+                std::uint16_t number{};
+                return get_number(input_format_t::cbor, number) && sax->number_unsigned(number);
+            }
+
+            case 0x1A: // Unsigned integer (four-byte uint32_t follows)
+            {
+                std::uint32_t number{};
+                return get_number(input_format_t::cbor, number) && sax->number_unsigned(number);
+            }
+
+            case 0x1B: // Unsigned integer (eight-byte uint64_t follows)
+            {
+                std::uint64_t number{};
+                return get_number(input_format_t::cbor, number) && sax->number_unsigned(number);
+            }
+
+            // Negative integer -1-0x00..-1-0x17 (-1..-24)
+            case 0x20:
+            case 0x21:
+            case 0x22:
+            case 0x23:
+            case 0x24:
+            case 0x25:
+            case 0x26:
+            case 0x27:
+            case 0x28:
+            case 0x29:
+            case 0x2A:
+            case 0x2B:
+            case 0x2C:
+            case 0x2D:
+            case 0x2E:
+            case 0x2F:
+            case 0x30:
+            case 0x31:
+            case 0x32:
+            case 0x33:
+            case 0x34:
+            case 0x35:
+            case 0x36:
+            case 0x37:
+                return sax->number_integer(static_cast<std::int8_t>(0x20 - 1 - current));
+
+            case 0x38: // Negative integer (one-byte uint8_t follows)
+            {
+                std::uint8_t number{};
+                return get_number(input_format_t::cbor, number) && sax->number_integer(static_cast<number_integer_t>(-1) - number);
+            }
+
+            case 0x39: // Negative integer -1-n (two-byte uint16_t follows)
+            {
+                std::uint16_t number{};
+                return get_number(input_format_t::cbor, number) && sax->number_integer(static_cast<number_integer_t>(-1) - number);
+            }
+
+            case 0x3A: // Negative integer -1-n (four-byte uint32_t follows)
+            {
+                std::uint32_t number{};
+                return get_number(input_format_t::cbor, number) && sax->number_integer(static_cast<number_integer_t>(-1) - number);
+            }
+
+            case 0x3B: // Negative integer -1-n (eight-byte uint64_t follows)
+            {
+                std::uint64_t number{};
+                return get_number(input_format_t::cbor, number) && sax->number_integer(static_cast<number_integer_t>(-1)
+                        - static_cast<number_integer_t>(number));
+            }
+
+            // Binary data (0x00..0x17 bytes follow)
+            case 0x40:
+            case 0x41:
+            case 0x42:
+            case 0x43:
+            case 0x44:
+            case 0x45:
+            case 0x46:
+            case 0x47:
+            case 0x48:
+            case 0x49:
+            case 0x4A:
+            case 0x4B:
+            case 0x4C:
+            case 0x4D:
+            case 0x4E:
+            case 0x4F:
+            case 0x50:
+            case 0x51:
+            case 0x52:
+            case 0x53:
+            case 0x54:
+            case 0x55:
+            case 0x56:
+            case 0x57:
+            case 0x58: // Binary data (one-byte uint8_t for n follows)
+            case 0x59: // Binary data (two-byte uint16_t for n follow)
+            case 0x5A: // Binary data (four-byte uint32_t for n follow)
+            case 0x5B: // Binary data (eight-byte uint64_t for n follow)
+            case 0x5F: // Binary data (indefinite length)
+            {
+                binary_t b;
+                return get_cbor_binary(b) && sax->binary(b);
+            }
+
+            // UTF-8 string (0x00..0x17 bytes follow)
+            case 0x60:
+            case 0x61:
+            case 0x62:
+            case 0x63:
+            case 0x64:
+            case 0x65:
+            case 0x66:
+            case 0x67:
+            case 0x68:
+            case 0x69:
+            case 0x6A:
+            case 0x6B:
+            case 0x6C:
+            case 0x6D:
+            case 0x6E:
+            case 0x6F:
+            case 0x70:
+            case 0x71:
+            case 0x72:
+            case 0x73:
+            case 0x74:
+            case 0x75:
+            case 0x76:
+            case 0x77:
+            case 0x78: // UTF-8 string (one-byte uint8_t for n follows)
+            case 0x79: // UTF-8 string (two-byte uint16_t for n follow)
+            case 0x7A: // UTF-8 string (four-byte uint32_t for n follow)
+            case 0x7B: // UTF-8 string (eight-byte uint64_t for n follow)
+            case 0x7F: // UTF-8 string (indefinite length)
+            {
+                string_t s;
+                return get_cbor_string(s) && sax->string(s);
+            }
+
+            // array (0x00..0x17 data items follow)
+            case 0x80:
+            case 0x81:
+            case 0x82:
+            case 0x83:
+            case 0x84:
+            case 0x85:
+            case 0x86:
+            case 0x87:
+            case 0x88:
+            case 0x89:
+            case 0x8A:
+            case 0x8B:
+            case 0x8C:
+            case 0x8D:
+            case 0x8E:
+            case 0x8F:
+            case 0x90:
+            case 0x91:
+            case 0x92:
+            case 0x93:
+            case 0x94:
+            case 0x95:
+            case 0x96:
+            case 0x97:
+                return get_cbor_array(static_cast<std::size_t>(static_cast<unsigned int>(current) & 0x1Fu), tag_handler);
+
+            case 0x98: // array (one-byte uint8_t for n follows)
+            {
+                std::uint8_t len{};
+                return get_number(input_format_t::cbor, len) && get_cbor_array(static_cast<std::size_t>(len), tag_handler);
+            }
+
+            case 0x99: // array (two-byte uint16_t for n follow)
+            {
+                std::uint16_t len{};
+                return get_number(input_format_t::cbor, len) && get_cbor_array(static_cast<std::size_t>(len), tag_handler);
+            }
+
+            case 0x9A: // array (four-byte uint32_t for n follow)
+            {
+                std::uint32_t len{};
+                return get_number(input_format_t::cbor, len) && get_cbor_array(static_cast<std::size_t>(len), tag_handler);
+            }
+
+            case 0x9B: // array (eight-byte uint64_t for n follow)
+            {
+                std::uint64_t len{};
+                return get_number(input_format_t::cbor, len) && get_cbor_array(detail::conditional_static_cast<std::size_t>(len), tag_handler);
+            }
+
+            case 0x9F: // array (indefinite length)
+                return get_cbor_array(static_cast<std::size_t>(-1), tag_handler);
+
+            // map (0x00..0x17 pairs of data items follow)
+            case 0xA0:
+            case 0xA1:
+            case 0xA2:
+            case 0xA3:
+            case 0xA4:
+            case 0xA5:
+            case 0xA6:
+            case 0xA7:
+            case 0xA8:
+            case 0xA9:
+            case 0xAA:
+            case 0xAB:
+            case 0xAC:
+            case 0xAD:
+            case 0xAE:
+            case 0xAF:
+            case 0xB0:
+            case 0xB1:
+            case 0xB2:
+            case 0xB3:
+            case 0xB4:
+            case 0xB5:
+            case 0xB6:
+            case 0xB7:
+                return get_cbor_object(static_cast<std::size_t>(static_cast<unsigned int>(current) & 0x1Fu), tag_handler);
+
+            case 0xB8: // map (one-byte uint8_t for n follows)
+            {
+                std::uint8_t len{};
+                return get_number(input_format_t::cbor, len) && get_cbor_object(static_cast<std::size_t>(len), tag_handler);
+            }
+
+            case 0xB9: // map (two-byte uint16_t for n follow)
+            {
+                std::uint16_t len{};
+                return get_number(input_format_t::cbor, len) && get_cbor_object(static_cast<std::size_t>(len), tag_handler);
+            }
+
+            case 0xBA: // map (four-byte uint32_t for n follow)
+            {
+                std::uint32_t len{};
+                return get_number(input_format_t::cbor, len) && get_cbor_object(static_cast<std::size_t>(len), tag_handler);
+            }
+
+            case 0xBB: // map (eight-byte uint64_t for n follow)
+            {
+                std::uint64_t len{};
+                return get_number(input_format_t::cbor, len) && get_cbor_object(detail::conditional_static_cast<std::size_t>(len), tag_handler);
+            }
+
+            case 0xBF: // map (indefinite length)
+                return get_cbor_object(static_cast<std::size_t>(-1), tag_handler);
+
+            case 0xC6: // tagged item
+            case 0xC7:
+            case 0xC8:
+            case 0xC9:
+            case 0xCA:
+            case 0xCB:
+            case 0xCC:
+            case 0xCD:
+            case 0xCE:
+            case 0xCF:
+            case 0xD0:
+            case 0xD1:
+            case 0xD2:
+            case 0xD3:
+            case 0xD4:
+            case 0xD8: // tagged item (1 byte follows)
+            case 0xD9: // tagged item (2 bytes follow)
+            case 0xDA: // tagged item (4 bytes follow)
+            case 0xDB: // tagged item (8 bytes follow)
+            {
+                switch (tag_handler)
+                {
+                    case cbor_tag_handler_t::error:
+                    {
+                        auto last_token = get_token_string();
+                        return sax->parse_error(chars_read, last_token, parse_error::create(112, chars_read, exception_message(input_format_t::cbor, "invalid byte: 0x" + last_token, "value"), BasicJsonType()));
+                    }
+
+                    case cbor_tag_handler_t::ignore:
+                    {
+                        // ignore binary subtype
+                        switch (current)
+                        {
+                            case 0xD8:
+                            {
+                                std::uint8_t subtype_to_ignore{};
+                                get_number(input_format_t::cbor, subtype_to_ignore);
+                                break;
+                            }
+                            case 0xD9:
+                            {
+                                std::uint16_t subtype_to_ignore{};
+                                get_number(input_format_t::cbor, subtype_to_ignore);
+                                break;
+                            }
+                            case 0xDA:
+                            {
+                                std::uint32_t subtype_to_ignore{};
+                                get_number(input_format_t::cbor, subtype_to_ignore);
+                                break;
+                            }
+                            case 0xDB:
+                            {
+                                std::uint64_t subtype_to_ignore{};
+                                get_number(input_format_t::cbor, subtype_to_ignore);
+                                break;
+                            }
+                            default:
+                                break;
+                        }
+                        return parse_cbor_internal(true, tag_handler);
+                    }
+
+                    case cbor_tag_handler_t::store:
+                    {
+                        binary_t b;
+                        // use binary subtype and store in binary container
+                        switch (current)
+                        {
+                            case 0xD8:
+                            {
+                                std::uint8_t subtype{};
+                                get_number(input_format_t::cbor, subtype);
+                                b.set_subtype(detail::conditional_static_cast<typename binary_t::subtype_type>(subtype));
+                                break;
+                            }
+                            case 0xD9:
+                            {
+                                std::uint16_t subtype{};
+                                get_number(input_format_t::cbor, subtype);
+                                b.set_subtype(detail::conditional_static_cast<typename binary_t::subtype_type>(subtype));
+                                break;
+                            }
+                            case 0xDA:
+                            {
+                                std::uint32_t subtype{};
+                                get_number(input_format_t::cbor, subtype);
+                                b.set_subtype(detail::conditional_static_cast<typename binary_t::subtype_type>(subtype));
+                                break;
+                            }
+                            case 0xDB:
+                            {
+                                std::uint64_t subtype{};
+                                get_number(input_format_t::cbor, subtype);
+                                b.set_subtype(detail::conditional_static_cast<typename binary_t::subtype_type>(subtype));
+                                break;
+                            }
+                            default:
+                                return parse_cbor_internal(true, tag_handler);
+                        }
+                        get();
+                        return get_cbor_binary(b) && sax->binary(b);
+                    }
+
+                    default:                 // LCOV_EXCL_LINE
+                        JSON_ASSERT(false); // NOLINT(cert-dcl03-c,hicpp-static-assert,misc-static-assert) LCOV_EXCL_LINE
+                        return false;        // LCOV_EXCL_LINE
+                }
+            }
+
+            case 0xF4: // false
+                return sax->boolean(false);
+
+            case 0xF5: // true
+                return sax->boolean(true);
+
+            case 0xF6: // null
+                return sax->null();
+
+            case 0xF9: // Half-Precision Float (two-byte IEEE 754)
+            {
+                const auto byte1_raw = get();
+                if (JSON_HEDLEY_UNLIKELY(!unexpect_eof(input_format_t::cbor, "number")))
+                {
+                    return false;
+                }
+                const auto byte2_raw = get();
+                if (JSON_HEDLEY_UNLIKELY(!unexpect_eof(input_format_t::cbor, "number")))
+                {
+                    return false;
+                }
+
+                const auto byte1 = static_cast<unsigned char>(byte1_raw);
+                const auto byte2 = static_cast<unsigned char>(byte2_raw);
+
+                // code from RFC 7049, Appendix D, Figure 3:
+                // As half-precision floating-point numbers were only added
+                // to IEEE 754 in 2008, today's programming platforms often
+                // still only have limited support for them. It is very
+                // easy to include at least decoding support for them even
+                // without such support. An example of a small decoder for
+                // half-precision floating-point numbers in the C language
+                // is shown in Fig. 3.
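+                //
+                // Worked example (standard IEEE 754 half precision, added for
+                // illustration): the bytes 0x3C 0x00 give half = 0x3C00, so
+                // exp = 15 and mant = 0; the default branch below computes
+                // ldexp(0 + 1024, 15 - 25) = 1024 / 2^10 = 1.0.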
+                const auto half = static_cast<unsigned int>((byte1 << 8u) + byte2);
+                const double val = [&half]
+                {
+                    const int exp = (half >> 10u) & 0x1Fu;
+                    const unsigned int mant = half & 0x3FFu;
+                    JSON_ASSERT(0 <= exp && exp <= 32);
+                    JSON_ASSERT(mant <= 1024);
+                    switch (exp)
+                    {
+                        case 0:
+                            return std::ldexp(mant, -24);
+                        case 31:
+                            return (mant == 0)
+                            ? std::numeric_limits<double>::infinity()
+                            : std::numeric_limits<double>::quiet_NaN();
+                        default:
+                            return std::ldexp(mant + 1024, exp - 25);
+                    }
+                }();
+                return sax->number_float((half & 0x8000u) != 0
+                                         ? static_cast<number_float_t>(-val)
+                                         : static_cast<number_float_t>(val), "");
+            }
+
+            case 0xFA: // Single-Precision Float (four-byte IEEE 754)
+            {
+                float number{};
+                return get_number(input_format_t::cbor, number) && sax->number_float(static_cast<number_float_t>(number), "");
+            }
+
+            case 0xFB: // Double-Precision Float (eight-byte IEEE 754)
+            {
+                double number{};
+                return get_number(input_format_t::cbor, number) && sax->number_float(static_cast<number_float_t>(number), "");
+            }
+
+            default: // anything else (0xFF is handled inside the other types)
+            {
+                auto last_token = get_token_string();
+                return sax->parse_error(chars_read, last_token, parse_error::create(112, chars_read, exception_message(input_format_t::cbor, "invalid byte: 0x" + last_token, "value"), BasicJsonType()));
+            }
+        }
+    }
+
+    /*!
+    @brief reads a CBOR string
+
+    This function first reads starting bytes to determine the expected
+    string length and then copies this number of bytes into a string.
+    Additionally, CBOR's strings with indefinite lengths are supported.
+
+    @param[out] result  created string
+
+    @return whether string creation completed
+    */
+    bool get_cbor_string(string_t& result)
+    {
+        if (JSON_HEDLEY_UNLIKELY(!unexpect_eof(input_format_t::cbor, "string")))
+        {
+            return false;
+        }
+
+        switch (current)
+        {
+            // UTF-8 string (0x00..0x17 bytes follow)
+            case 0x60:
+            case 0x61:
+            case 0x62:
+            case 0x63:
+            case 0x64:
+            case 0x65:
+            case 0x66:
+            case 0x67:
+            case 0x68:
+            case 0x69:
+            case 0x6A:
+            case 0x6B:
+            case 0x6C:
+            case 0x6D:
+            case 0x6E:
+            case 0x6F:
+            case 0x70:
+            case 0x71:
+            case 0x72:
+            case 0x73:
+            case 0x74:
+            case 0x75:
+            case 0x76:
+            case 0x77:
+            {
+                return get_string(input_format_t::cbor, static_cast<unsigned int>(current) & 0x1Fu, result);
+            }
+
+            case 0x78: // UTF-8 string (one-byte uint8_t for n follows)
+            {
+                std::uint8_t len{};
+                return get_number(input_format_t::cbor, len) && get_string(input_format_t::cbor, len, result);
+            }
+
+            case 0x79: // UTF-8 string (two-byte uint16_t for n follow)
+            {
+                std::uint16_t len{};
+                return get_number(input_format_t::cbor, len) && get_string(input_format_t::cbor, len, result);
+            }
+
+            case 0x7A: // UTF-8 string (four-byte uint32_t for n follow)
+            {
+                std::uint32_t len{};
+                return get_number(input_format_t::cbor, len) && get_string(input_format_t::cbor, len, result);
+            }
+
+            case 0x7B: // UTF-8 string (eight-byte uint64_t for n follow)
+            {
+                std::uint64_t len{};
+                return get_number(input_format_t::cbor, len) && get_string(input_format_t::cbor, len, result);
+            }
+
+            case 0x7F: // UTF-8 string (indefinite length)
+            {
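+                // concatenate definite-length chunks until the 0xFF "break"
+                // byte; e.g. 0x7F 0x61 'a' 0x61 'b' 0xFF decodes to "ab"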
+                while (get() != 0xFF)
+                {
+                    string_t chunk;
+                    if (!get_cbor_string(chunk))
+                    {
+                        return false;
+                    }
+                    result.append(chunk);
+                }
+                return true;
+            }
+
+            default:
+            {
+                auto last_token = get_token_string();
+                return sax->parse_error(chars_read, last_token, parse_error::create(113, chars_read, exception_message(input_format_t::cbor, "expected length specification (0x60-0x7B) or indefinite string type (0x7F); last byte: 0x" + last_token, "string"), BasicJsonType()));
+            }
+        }
+    }
+
+    /*!
+    @brief reads a CBOR byte array
+
+    This function first reads starting bytes to determine the expected
+    byte array length and then copies this number of bytes into the byte array.
+    Additionally, CBOR's byte arrays with indefinite lengths are supported.
+
+    @param[out] result  created byte array
+
+    @return whether byte array creation completed
+    */
+    bool get_cbor_binary(binary_t& result)
+    {
+        if (JSON_HEDLEY_UNLIKELY(!unexpect_eof(input_format_t::cbor, "binary")))
+        {
+            return false;
+        }
+
+        switch (current)
+        {
+            // Binary data (0x00..0x17 bytes follow)
+            case 0x40:
+            case 0x41:
+            case 0x42:
+            case 0x43:
+            case 0x44:
+            case 0x45:
+            case 0x46:
+            case 0x47:
+            case 0x48:
+            case 0x49:
+            case 0x4A:
+            case 0x4B:
+            case 0x4C:
+            case 0x4D:
+            case 0x4E:
+            case 0x4F:
+            case 0x50:
+            case 0x51:
+            case 0x52:
+            case 0x53:
+            case 0x54:
+            case 0x55:
+            case 0x56:
+            case 0x57:
+            {
+                return get_binary(input_format_t::cbor, static_cast<unsigned int>(current) & 0x1Fu, result);
+            }
+
+            case 0x58: // Binary data (one-byte uint8_t for n follows)
+            {
+                std::uint8_t len{};
+                return get_number(input_format_t::cbor, len) &&
+                       get_binary(input_format_t::cbor, len, result);
+            }
+
+            case 0x59: // Binary data (two-byte uint16_t for n follow)
+            {
+                std::uint16_t len{};
+                return get_number(input_format_t::cbor, len) &&
+                       get_binary(input_format_t::cbor, len, result);
+            }
+
+            case 0x5A: // Binary data (four-byte uint32_t for n follow)
+            {
+                std::uint32_t len{};
+                return get_number(input_format_t::cbor, len) &&
+                       get_binary(input_format_t::cbor, len, result);
+            }
+
+            case 0x5B: // Binary data (eight-byte uint64_t for n follow)
+            {
+                std::uint64_t len{};
+                return get_number(input_format_t::cbor, len) &&
+                       get_binary(input_format_t::cbor, len, result);
+            }
+
+            case 0x5F: // Binary data (indefinite length)
+            {
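+                // as with indefinite-length strings, definite-length chunks
+                // are concatenated until the 0xFF "break" byte; e.g.
+                // 0x5F 0x41 0x01 0x41 0x02 0xFF yields the bytes {0x01, 0x02}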
+                while (get() != 0xFF)
+                {
+                    binary_t chunk;
+                    if (!get_cbor_binary(chunk))
+                    {
+                        return false;
+                    }
+                    result.insert(result.end(), chunk.begin(), chunk.end());
+                }
+                return true;
+            }
+
+            default:
+            {
+                auto last_token = get_token_string();
+                return sax->parse_error(chars_read, last_token, parse_error::create(113, chars_read, exception_message(input_format_t::cbor, "expected length specification (0x40-0x5B) or indefinite binary array type (0x5F); last byte: 0x" + last_token, "binary"), BasicJsonType()));
+            }
+        }
+    }
+
+    /*!
+    @param[in] len  the length of the array or static_cast<std::size_t>(-1) for an
+                    array of indefinite size
+    @param[in] tag_handler how CBOR tags should be treated
+    @return whether array creation completed
+    */
+    bool get_cbor_array(const std::size_t len,
+                        const cbor_tag_handler_t tag_handler)
+    {
+        if (JSON_HEDLEY_UNLIKELY(!sax->start_array(len)))
+        {
+            return false;
+        }
+
+        if (len != static_cast<std::size_t>(-1))
+        {
+            for (std::size_t i = 0; i < len; ++i)
+            {
+                if (JSON_HEDLEY_UNLIKELY(!parse_cbor_internal(true, tag_handler)))
+                {
+                    return false;
+                }
+            }
+        }
+        else
+        {
+            while (get() != 0xFF)
+            {
+                if (JSON_HEDLEY_UNLIKELY(!parse_cbor_internal(false, tag_handler)))
+                {
+                    return false;
+                }
+            }
+        }
+
+        return sax->end_array();
+    }
+
+    /*!
+    @param[in] len  the length of the object or static_cast<std::size_t>(-1) for an
+                    object of indefinite size
+    @param[in] tag_handler how CBOR tags should be treated
+    @return whether object creation completed
+    */
+    bool get_cbor_object(const std::size_t len,
+                         const cbor_tag_handler_t tag_handler)
+    {
+        if (JSON_HEDLEY_UNLIKELY(!sax->start_object(len)))
+        {
+            return false;
+        }
+
+        if (len != 0)
+        {
+            string_t key;
+            if (len != static_cast<std::size_t>(-1))
+            {
+                for (std::size_t i = 0; i < len; ++i)
+                {
+                    get();
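+                    // get() fetched the first byte of the key;
+                    // get_cbor_string() inspects 'current' directly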
+                    if (JSON_HEDLEY_UNLIKELY(!get_cbor_string(key) || !sax->key(key)))
+                    {
+                        return false;
+                    }
+
+                    if (JSON_HEDLEY_UNLIKELY(!parse_cbor_internal(true, tag_handler)))
+                    {
+                        return false;
+                    }
+                    key.clear();
+                }
+            }
+            else
+            {
+                while (get() != 0xFF)
+                {
+                    if (JSON_HEDLEY_UNLIKELY(!get_cbor_string(key) || !sax->key(key)))
+                    {
+                        return false;
+                    }
+
+                    if (JSON_HEDLEY_UNLIKELY(!parse_cbor_internal(true, tag_handler)))
+                    {
+                        return false;
+                    }
+                    key.clear();
+                }
+            }
+        }
+
+        return sax->end_object();
+    }
+
+    /////////////
+    // MsgPack //
+    /////////////
+
+    /*!
+    @return whether a valid MessagePack value was passed to the SAX parser
+    */
+    bool parse_msgpack_internal()
+    {
+        switch (get())
+        {
+            // EOF
+            case std::char_traits<char_type>::eof():
+                return unexpect_eof(input_format_t::msgpack, "value");
+
+            // positive fixint
+            case 0x00:
+            case 0x01:
+            case 0x02:
+            case 0x03:
+            case 0x04:
+            case 0x05:
+            case 0x06:
+            case 0x07:
+            case 0x08:
+            case 0x09:
+            case 0x0A:
+            case 0x0B:
+            case 0x0C:
+            case 0x0D:
+            case 0x0E:
+            case 0x0F:
+            case 0x10:
+            case 0x11:
+            case 0x12:
+            case 0x13:
+            case 0x14:
+            case 0x15:
+            case 0x16:
+            case 0x17:
+            case 0x18:
+            case 0x19:
+            case 0x1A:
+            case 0x1B:
+            case 0x1C:
+            case 0x1D:
+            case 0x1E:
+            case 0x1F:
+            case 0x20:
+            case 0x21:
+            case 0x22:
+            case 0x23:
+            case 0x24:
+            case 0x25:
+            case 0x26:
+            case 0x27:
+            case 0x28:
+            case 0x29:
+            case 0x2A:
+            case 0x2B:
+            case 0x2C:
+            case 0x2D:
+            case 0x2E:
+            case 0x2F:
+            case 0x30:
+            case 0x31:
+            case 0x32:
+            case 0x33:
+            case 0x34:
+            case 0x35:
+            case 0x36:
+            case 0x37:
+            case 0x38:
+            case 0x39:
+            case 0x3A:
+            case 0x3B:
+            case 0x3C:
+            case 0x3D:
+            case 0x3E:
+            case 0x3F:
+            case 0x40:
+            case 0x41:
+            case 0x42:
+            case 0x43:
+            case 0x44:
+            case 0x45:
+            case 0x46:
+            case 0x47:
+            case 0x48:
+            case 0x49:
+            case 0x4A:
+            case 0x4B:
+            case 0x4C:
+            case 0x4D:
+            case 0x4E:
+            case 0x4F:
+            case 0x50:
+            case 0x51:
+            case 0x52:
+            case 0x53:
+            case 0x54:
+            case 0x55:
+            case 0x56:
+            case 0x57:
+            case 0x58:
+            case 0x59:
+            case 0x5A:
+            case 0x5B:
+            case 0x5C:
+            case 0x5D:
+            case 0x5E:
+            case 0x5F:
+            case 0x60:
+            case 0x61:
+            case 0x62:
+            case 0x63:
+            case 0x64:
+            case 0x65:
+            case 0x66:
+            case 0x67:
+            case 0x68:
+            case 0x69:
+            case 0x6A:
+            case 0x6B:
+            case 0x6C:
+            case 0x6D:
+            case 0x6E:
+            case 0x6F:
+            case 0x70:
+            case 0x71:
+            case 0x72:
+            case 0x73:
+            case 0x74:
+            case 0x75:
+            case 0x76:
+            case 0x77:
+            case 0x78:
+            case 0x79:
+            case 0x7A:
+            case 0x7B:
+            case 0x7C:
+            case 0x7D:
+            case 0x7E:
+            case 0x7F:
+                return sax->number_unsigned(static_cast<number_unsigned_t>(current));
+
+            // fixmap
+            case 0x80:
+            case 0x81:
+            case 0x82:
+            case 0x83:
+            case 0x84:
+            case 0x85:
+            case 0x86:
+            case 0x87:
+            case 0x88:
+            case 0x89:
+            case 0x8A:
+            case 0x8B:
+            case 0x8C:
+            case 0x8D:
+            case 0x8E:
+            case 0x8F:
+                return get_msgpack_object(static_cast<std::size_t>(static_cast<unsigned int>(current) & 0x0Fu));
+
+            // fixarray
+            case 0x90:
+            case 0x91:
+            case 0x92:
+            case 0x93:
+            case 0x94:
+            case 0x95:
+            case 0x96:
+            case 0x97:
+            case 0x98:
+            case 0x99:
+            case 0x9A:
+            case 0x9B:
+            case 0x9C:
+            case 0x9D:
+            case 0x9E:
+            case 0x9F:
+                return get_msgpack_array(static_cast<std::size_t>(static_cast<unsigned int>(current) & 0x0Fu));
+
+            // fixstr
+            case 0xA0:
+            case 0xA1:
+            case 0xA2:
+            case 0xA3:
+            case 0xA4:
+            case 0xA5:
+            case 0xA6:
+            case 0xA7:
+            case 0xA8:
+            case 0xA9:
+            case 0xAA:
+            case 0xAB:
+            case 0xAC:
+            case 0xAD:
+            case 0xAE:
+            case 0xAF:
+            case 0xB0:
+            case 0xB1:
+            case 0xB2:
+            case 0xB3:
+            case 0xB4:
+            case 0xB5:
+            case 0xB6:
+            case 0xB7:
+            case 0xB8:
+            case 0xB9:
+            case 0xBA:
+            case 0xBB:
+            case 0xBC:
+            case 0xBD:
+            case 0xBE:
+            case 0xBF:
+            case 0xD9: // str 8
+            case 0xDA: // str 16
+            case 0xDB: // str 32
+            {
+                string_t s;
+                return get_msgpack_string(s) && sax->string(s);
+            }
+
+            case 0xC0: // nil
+                return sax->null();
+
+            case 0xC2: // false
+                return sax->boolean(false);
+
+            case 0xC3: // true
+                return sax->boolean(true);
+
+            case 0xC4: // bin 8
+            case 0xC5: // bin 16
+            case 0xC6: // bin 32
+            case 0xC7: // ext 8
+            case 0xC8: // ext 16
+            case 0xC9: // ext 32
+            case 0xD4: // fixext 1
+            case 0xD5: // fixext 2
+            case 0xD6: // fixext 4
+            case 0xD7: // fixext 8
+            case 0xD8: // fixext 16
+            {
+                binary_t b;
+                return get_msgpack_binary(b) && sax->binary(b);
+            }
+
+            case 0xCA: // float 32
+            {
+                float number{};
+                return get_number(input_format_t::msgpack, number) && sax->number_float(static_cast<number_float_t>(number), "");
+            }
+
+            case 0xCB: // float 64
+            {
+                double number{};
+                return get_number(input_format_t::msgpack, number) && sax->number_float(static_cast<number_float_t>(number), "");
+            }
+
+            case 0xCC: // uint 8
+            {
+                std::uint8_t number{};
+                return get_number(input_format_t::msgpack, number) && sax->number_unsigned(number);
+            }
+
+            case 0xCD: // uint 16
+            {
+                std::uint16_t number{};
+                return get_number(input_format_t::msgpack, number) && sax->number_unsigned(number);
+            }
+
+            case 0xCE: // uint 32
+            {
+                std::uint32_t number{};
+                return get_number(input_format_t::msgpack, number) && sax->number_unsigned(number);
+            }
+
+            case 0xCF: // uint 64
+            {
+                std::uint64_t number{};
+                return get_number(input_format_t::msgpack, number) && sax->number_unsigned(number);
+            }
+
+            case 0xD0: // int 8
+            {
+                std::int8_t number{};
+                return get_number(input_format_t::msgpack, number) && sax->number_integer(number);
+            }
+
+            case 0xD1: // int 16
+            {
+                std::int16_t number{};
+                return get_number(input_format_t::msgpack, number) && sax->number_integer(number);
+            }
+
+            case 0xD2: // int 32
+            {
+                std::int32_t number{};
+                return get_number(input_format_t::msgpack, number) && sax->number_integer(number);
+            }
+
+            case 0xD3: // int 64
+            {
+                std::int64_t number{};
+                return get_number(input_format_t::msgpack, number) && sax->number_integer(number);
+            }
+
+            case 0xDC: // array 16
+            {
+                std::uint16_t len{};
+                return get_number(input_format_t::msgpack, len) && get_msgpack_array(static_cast<std::size_t>(len));
+            }
+
+            case 0xDD: // array 32
+            {
+                std::uint32_t len{};
+                return get_number(input_format_t::msgpack, len) && get_msgpack_array(static_cast<std::size_t>(len));
+            }
+
+            case 0xDE: // map 16
+            {
+                std::uint16_t len{};
+                return get_number(input_format_t::msgpack, len) && get_msgpack_object(static_cast<std::size_t>(len));
+            }
+
+            case 0xDF: // map 32
+            {
+                std::uint32_t len{};
+                return get_number(input_format_t::msgpack, len) && get_msgpack_object(static_cast<std::size_t>(len));
+            }
+
+            // negative fixint
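+            // 0xE0..0xFF are the two's-complement encodings of -32..-1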
+            case 0xE0:
+            case 0xE1:
+            case 0xE2:
+            case 0xE3:
+            case 0xE4:
+            case 0xE5:
+            case 0xE6:
+            case 0xE7:
+            case 0xE8:
+            case 0xE9:
+            case 0xEA:
+            case 0xEB:
+            case 0xEC:
+            case 0xED:
+            case 0xEE:
+            case 0xEF:
+            case 0xF0:
+            case 0xF1:
+            case 0xF2:
+            case 0xF3:
+            case 0xF4:
+            case 0xF5:
+            case 0xF6:
+            case 0xF7:
+            case 0xF8:
+            case 0xF9:
+            case 0xFA:
+            case 0xFB:
+            case 0xFC:
+            case 0xFD:
+            case 0xFE:
+            case 0xFF:
+                return sax->number_integer(static_cast<std::int8_t>(current));
+
+            default: // anything else
+            {
+                auto last_token = get_token_string();
+                return sax->parse_error(chars_read, last_token, parse_error::create(112, chars_read, exception_message(input_format_t::msgpack, "invalid byte: 0x" + last_token, "value"), BasicJsonType()));
+            }
+        }
+    }
+
+    /*!
+    @brief reads a MessagePack string
+
+    This function first reads starting bytes to determine the expected
+    string length and then copies this number of bytes into a string.
+
+    @param[out] result  created string
+
+    @return whether string creation completed
+    */
+    bool get_msgpack_string(string_t& result)
+    {
+        if (JSON_HEDLEY_UNLIKELY(!unexpect_eof(input_format_t::msgpack, "string")))
+        {
+            return false;
+        }
+
+        switch (current)
+        {
+            // fixstr
+            case 0xA0:
+            case 0xA1:
+            case 0xA2:
+            case 0xA3:
+            case 0xA4:
+            case 0xA5:
+            case 0xA6:
+            case 0xA7:
+            case 0xA8:
+            case 0xA9:
+            case 0xAA:
+            case 0xAB:
+            case 0xAC:
+            case 0xAD:
+            case 0xAE:
+            case 0xAF:
+            case 0xB0:
+            case 0xB1:
+            case 0xB2:
+            case 0xB3:
+            case 0xB4:
+            case 0xB5:
+            case 0xB6:
+            case 0xB7:
+            case 0xB8:
+            case 0xB9:
+            case 0xBA:
+            case 0xBB:
+            case 0xBC:
+            case 0xBD:
+            case 0xBE:
+            case 0xBF:
+            {
+                return get_string(input_format_t::msgpack, static_cast<unsigned int>(current) & 0x1Fu, result);
+            }
+
+            case 0xD9: // str 8
+            {
+                std::uint8_t len{};
+                return get_number(input_format_t::msgpack, len) && get_string(input_format_t::msgpack, len, result);
+            }
+
+            case 0xDA: // str 16
+            {
+                std::uint16_t len{};
+                return get_number(input_format_t::msgpack, len) && get_string(input_format_t::msgpack, len, result);
+            }
+
+            case 0xDB: // str 32
+            {
+                std::uint32_t len{};
+                return get_number(input_format_t::msgpack, len) && get_string(input_format_t::msgpack, len, result);
+            }
+
+            default:
+            {
+                auto last_token = get_token_string();
+                return sax->parse_error(chars_read, last_token, parse_error::create(113, chars_read, exception_message(input_format_t::msgpack, "expected length specification (0xA0-0xBF, 0xD9-0xDB); last byte: 0x" + last_token, "string"), BasicJsonType()));
+            }
+        }
+    }
+
+    /*!
+    @brief reads a MessagePack byte array
+
+    This function first reads starting bytes to determine the expected
+    byte array length and then copies this number of bytes into a byte array.
+
+    @param[out] result  created byte array
+
+    @return whether byte array creation completed
+    */
+    bool get_msgpack_binary(binary_t& result)
+    {
+        // helper function to set the subtype
+        auto assign_and_return_true = [&result](std::int8_t subtype)
+        {
+            result.set_subtype(static_cast<std::uint8_t>(subtype));
+            return true;
+        };
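+        // the subtype is assigned only as the last step of each '&&' chain
+        // below, i.e. after the payload has been read successfully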
+
+        switch (current)
+        {
+            case 0xC4: // bin 8
+            {
+                std::uint8_t len{};
+                return get_number(input_format_t::msgpack, len) &&
+                       get_binary(input_format_t::msgpack, len, result);
+            }
+
+            case 0xC5: // bin 16
+            {
+                std::uint16_t len{};
+                return get_number(input_format_t::msgpack, len) &&
+                       get_binary(input_format_t::msgpack, len, result);
+            }
+
+            case 0xC6: // bin 32
+            {
+                std::uint32_t len{};
+                return get_number(input_format_t::msgpack, len) &&
+                       get_binary(input_format_t::msgpack, len, result);
+            }
+
+            case 0xC7: // ext 8
+            {
+                std::uint8_t len{};
+                std::int8_t subtype{};
+                return get_number(input_format_t::msgpack, len) &&
+                       get_number(input_format_t::msgpack, subtype) &&
+                       get_binary(input_format_t::msgpack, len, result) &&
+                       assign_and_return_true(subtype);
+            }
+
+            case 0xC8: // ext 16
+            {
+                std::uint16_t len{};
+                std::int8_t subtype{};
+                return get_number(input_format_t::msgpack, len) &&
+                       get_number(input_format_t::msgpack, subtype) &&
+                       get_binary(input_format_t::msgpack, len, result) &&
+                       assign_and_return_true(subtype);
+            }
+
+            case 0xC9: // ext 32
+            {
+                std::uint32_t len{};
+                std::int8_t subtype{};
+                return get_number(input_format_t::msgpack, len) &&
+                       get_number(input_format_t::msgpack, subtype) &&
+                       get_binary(input_format_t::msgpack, len, result) &&
+                       assign_and_return_true(subtype);
+            }
+
+            case 0xD4: // fixext 1
+            {
+                std::int8_t subtype{};
+                return get_number(input_format_t::msgpack, subtype) &&
+                       get_binary(input_format_t::msgpack, 1, result) &&
+                       assign_and_return_true(subtype);
+            }
+
+            case 0xD5: // fixext 2
+            {
+                std::int8_t subtype{};
+                return get_number(input_format_t::msgpack, subtype) &&
+                       get_binary(input_format_t::msgpack, 2, result) &&
+                       assign_and_return_true(subtype);
+            }
+
+            case 0xD6: // fixext 4
+            {
+                std::int8_t subtype{};
+                return get_number(input_format_t::msgpack, subtype) &&
+                       get_binary(input_format_t::msgpack, 4, result) &&
+                       assign_and_return_true(subtype);
+            }
+
+            case 0xD7: // fixext 8
+            {
+                std::int8_t subtype{};
+                return get_number(input_format_t::msgpack, subtype) &&
+                       get_binary(input_format_t::msgpack, 8, result) &&
+                       assign_and_return_true(subtype);
+            }
+
+            case 0xD8: // fixext 16
+            {
+                std::int8_t subtype{};
+                return get_number(input_format_t::msgpack, subtype) &&
+                       get_binary(input_format_t::msgpack, 16, result) &&
+                       assign_and_return_true(subtype);
+            }
+
+            default:           // LCOV_EXCL_LINE
+                return false;  // LCOV_EXCL_LINE
+        }
+    }
+
+    /*!
+    @param[in] len  the length of the array
+    @return whether array creation completed
+    */
+    bool get_msgpack_array(const std::size_t len)
+    {
+        if (JSON_HEDLEY_UNLIKELY(!sax->start_array(len)))
+        {
+            return false;
+        }
+
+        for (std::size_t i = 0; i < len; ++i)
+        {
+            if (JSON_HEDLEY_UNLIKELY(!parse_msgpack_internal()))
+            {
+                return false;
+            }
+        }
+
+        return sax->end_array();
+    }
+
+    /*!
+    @param[in] len  the length of the object
+    @return whether object creation completed
+    */
+    bool get_msgpack_object(const std::size_t len)
+    {
+        if (JSON_HEDLEY_UNLIKELY(!sax->start_object(len)))
+        {
+            return false;
+        }
+
+        string_t key;
+        for (std::size_t i = 0; i < len; ++i)
+        {
+            get();
+            if (JSON_HEDLEY_UNLIKELY(!get_msgpack_string(key) || !sax->key(key)))
+            {
+                return false;
+            }
+
+            if (JSON_HEDLEY_UNLIKELY(!parse_msgpack_internal()))
+            {
+                return false;
+            }
+            key.clear();
+        }
+
+        return sax->end_object();
+    }
+
+    ////////////
+    // UBJSON //
+    ////////////
+
+    /*!
+    @param[in] get_char  whether a new character should be retrieved from the
+                         input (true, default) or whether the last read
+                         character should be considered instead
+
+    @return whether a valid UBJSON value was passed to the SAX parser
+    */
+    bool parse_ubjson_internal(const bool get_char = true)
+    {
+        return get_ubjson_value(get_char ? get_ignore_noop() : current);
+    }
+
+    /*!
+    @brief reads a UBJSON string
+
+    This function is either called after reading the 'S' byte explicitly
+    indicating a string, or in case of an object key where the 'S' byte can be
+    left out.
+
+    @param[out] result   created string
+    @param[in] get_char  whether a new character should be retrieved from the
+                         input (true, default) or whether the last read
+                         character should be considered instead
+
+    @return whether string creation completed
+    */
+    bool get_ubjson_string(string_t& result, const bool get_char = true)
+    {
+        if (get_char)
+        {
+            get();  // TODO(niels): may we ignore N here?
+        }
+
+        if (JSON_HEDLEY_UNLIKELY(!unexpect_eof(input_format_t::ubjson, "value")))
+        {
+            return false;
+        }
+
+        switch (current)
+        {
+            case 'U':
+            {
+                std::uint8_t len{};
+                return get_number(input_format_t::ubjson, len) && get_string(input_format_t::ubjson, len, result);
+            }
+
+            case 'i':
+            {
+                std::int8_t len{};
+                return get_number(input_format_t::ubjson, len) && get_string(input_format_t::ubjson, len, result);
+            }
+
+            case 'I':
+            {
+                std::int16_t len{};
+                return get_number(input_format_t::ubjson, len) && get_string(input_format_t::ubjson, len, result);
+            }
+
+            case 'l':
+            {
+                std::int32_t len{};
+                return get_number(input_format_t::ubjson, len) && get_string(input_format_t::ubjson, len, result);
+            }
+
+            case 'L':
+            {
+                std::int64_t len{};
+                return get_number(input_format_t::ubjson, len) && get_string(input_format_t::ubjson, len, result);
+            }
+
+            default:
+            {
+                auto last_token = get_token_string();
+                return sax->parse_error(chars_read, last_token, parse_error::create(113, chars_read, exception_message(input_format_t::ubjson, "expected length type specification (U, i, I, l, L); last byte: 0x" + last_token, "string"), BasicJsonType()));
+            }
+        }
+    }
+
+    /*!
+    @param[out] result  determined size
+    @return whether size determination completed
+    */
+    bool get_ubjson_size_value(std::size_t& result)
+    {
+        switch (get_ignore_noop())
+        {
+            case 'U':
+            {
+                std::uint8_t number{};
+                if (JSON_HEDLEY_UNLIKELY(!get_number(input_format_t::ubjson, number)))
+                {
+                    return false;
+                }
+                result = static_cast<std::size_t>(number);
+                return true;
+            }
+
+            case 'i':
+            {
+                std::int8_t number{};
+                if (JSON_HEDLEY_UNLIKELY(!get_number(input_format_t::ubjson, number)))
+                {
+                    return false;
+                }
+                result = static_cast<std::size_t>(number); // NOLINT(bugprone-signed-char-misuse,cert-str34-c): number is not a char
+                return true;
+            }
+
+            case 'I':
+            {
+                std::int16_t number{};
+                if (JSON_HEDLEY_UNLIKELY(!get_number(input_format_t::ubjson, number)))
+                {
+                    return false;
+                }
+                result = static_cast<std::size_t>(number);
+                return true;
+            }
+
+            case 'l':
+            {
+                std::int32_t number{};
+                if (JSON_HEDLEY_UNLIKELY(!get_number(input_format_t::ubjson, number)))
+                {
+                    return false;
+                }
+                result = static_cast<std::size_t>(number);
+                return true;
+            }
+
+            case 'L':
+            {
+                std::int64_t number{};
+                if (JSON_HEDLEY_UNLIKELY(!get_number(input_format_t::ubjson, number)))
+                {
+                    return false;
+                }
+                result = static_cast<std::size_t>(number);
+                return true;
+            }
+
+            default:
+            {
+                auto last_token = get_token_string();
+                return sax->parse_error(chars_read, last_token, parse_error::create(113, chars_read, exception_message(input_format_t::ubjson, "expected length type specification (U, i, I, l, L) after '#'; last byte: 0x" + last_token, "size"), BasicJsonType()));
+            }
+        }
+    }
+
+    /*!
+    @brief determine the type and size for a container
+
+    In the optimized UBJSON format, a type and a size can be provided to allow
+    for a more compact representation.
+
+    @param[out] result  pair of the size and the type
+
+    @return whether pair creation completed
+    */
+    bool get_ubjson_size_type(std::pair<std::size_t, char_int_type>& result)
+    {
+        result.first = string_t::npos; // size
+        result.second = 0; // type
+
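+        // Example of the optimized format: the bytes '[' '$' 'i' '#' 'i' 0x02
+        // declare an array of two int8 values; the two payload bytes that
+        // follow then need no per-element type markers.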
+        get_ignore_noop();
+
+        if (current == '$')
+        {
+            result.second = get();  // must not ignore 'N', because 'N' may be the type
+            if (JSON_HEDLEY_UNLIKELY(!unexpect_eof(input_format_t::ubjson, "type")))
+            {
+                return false;
+            }
+
+            get_ignore_noop();
+            if (JSON_HEDLEY_UNLIKELY(current != '#'))
+            {
+                if (JSON_HEDLEY_UNLIKELY(!unexpect_eof(input_format_t::ubjson, "value")))
+                {
+                    return false;
+                }
+                auto last_token = get_token_string();
+                return sax->parse_error(chars_read, last_token, parse_error::create(112, chars_read, exception_message(input_format_t::ubjson, "expected '#' after type information; last byte: 0x" + last_token, "size"), BasicJsonType()));
+            }
+
+            return get_ubjson_size_value(result.first);
+        }
+
+        if (current == '#')
+        {
+            return get_ubjson_size_value(result.first);
+        }
+
+        return true;
+    }
+
+    /*!
+    @param prefix  the previously read or set type prefix
+    @return whether value creation completed
+    */
+    bool get_ubjson_value(const char_int_type prefix)
+    {
+        switch (prefix)
+        {
+            case std::char_traits<char_type>::eof():  // EOF
+                return unexpect_eof(input_format_t::ubjson, "value");
+
+            case 'T':  // true
+                return sax->boolean(true);
+            case 'F':  // false
+                return sax->boolean(false);
+
+            case 'Z':  // null
+                return sax->null();
+
+            case 'U':
+            {
+                std::uint8_t number{};
+                return get_number(input_format_t::ubjson, number) && sax->number_unsigned(number);
+            }
+
+            case 'i':
+            {
+                std::int8_t number{};
+                return get_number(input_format_t::ubjson, number) && sax->number_integer(number);
+            }
+
+            case 'I':
+            {
+                std::int16_t number{};
+                return get_number(input_format_t::ubjson, number) && sax->number_integer(number);
+            }
+
+            case 'l':
+            {
+                std::int32_t number{};
+                return get_number(input_format_t::ubjson, number) && sax->number_integer(number);
+            }
+
+            case 'L':
+            {
+                std::int64_t number{};
+                return get_number(input_format_t::ubjson, number) && sax->number_integer(number);
+            }
+
+            case 'd':
+            {
+                float number{};
+                return get_number(input_format_t::ubjson, number) && sax->number_float(static_cast<number_float_t>(number), "");
+            }
+
+            case 'D':
+            {
+                double number{};
+                return get_number(input_format_t::ubjson, number) && sax->number_float(static_cast<number_float_t>(number), "");
+            }
+
+            case 'H':
+            {
+                return get_ubjson_high_precision_number();
+            }
+
+            case 'C':  // char
+            {
+                get();
+                if (JSON_HEDLEY_UNLIKELY(!unexpect_eof(input_format_t::ubjson, "char")))
+                {
+                    return false;
+                }
+                if (JSON_HEDLEY_UNLIKELY(current > 127))
+                {
+                    auto last_token = get_token_string();
+                    return sax->parse_error(chars_read, last_token, parse_error::create(113, chars_read, exception_message(input_format_t::ubjson, "byte after 'C' must be in range 0x00..0x7F; last byte: 0x" + last_token, "char"), BasicJsonType()));
+                }
+                string_t s(1, static_cast<typename string_t::value_type>(current));
+                return sax->string(s);
+            }
+
+            case 'S':  // string
+            {
+                string_t s;
+                return get_ubjson_string(s) && sax->string(s);
+            }
+
+            case '[':  // array
+                return get_ubjson_array();
+
+            case '{':  // object
+                return get_ubjson_object();
+
+            default: // anything else
+            {
+                auto last_token = get_token_string();
+                return sax->parse_error(chars_read, last_token, parse_error::create(112, chars_read, exception_message(input_format_t::ubjson, "invalid byte: 0x" + last_token, "value"), BasicJsonType()));
+            }
+        }
+    }
+
+    /*!
+    @return whether array creation completed
+    */
+    bool get_ubjson_array()
+    {
+        std::pair<std::size_t, char_int_type> size_and_type;
+        if (JSON_HEDLEY_UNLIKELY(!get_ubjson_size_type(size_and_type)))
+        {
+            return false;
+        }
+
+        if (size_and_type.first != string_t::npos)
+        {
+            if (JSON_HEDLEY_UNLIKELY(!sax->start_array(size_and_type.first)))
+            {
+                return false;
+            }
+
+            if (size_and_type.second != 0)
+            {
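+                // a declared type of 'N' (no-op) means the elements carry no
+                // payload bytes, so nothing needs to be read for them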
+                if (size_and_type.second != 'N')
+                {
+                    for (std::size_t i = 0; i < size_and_type.first; ++i)
+                    {
+                        if (JSON_HEDLEY_UNLIKELY(!get_ubjson_value(size_and_type.second)))
+                        {
+                            return false;
+                        }
+                    }
+                }
+            }
+            else
+            {
+                for (std::size_t i = 0; i < size_and_type.first; ++i)
+                {
+                    if (JSON_HEDLEY_UNLIKELY(!parse_ubjson_internal()))
+                    {
+                        return false;
+                    }
+                }
+            }
+        }
+        else
+        {
+            if (JSON_HEDLEY_UNLIKELY(!sax->start_array(static_cast<std::size_t>(-1))))
+            {
+                return false;
+            }
+
+            while (current != ']')
+            {
+                if (JSON_HEDLEY_UNLIKELY(!parse_ubjson_internal(false)))
+                {
+                    return false;
+                }
+                get_ignore_noop();
+            }
+        }
+
+        return sax->end_array();
+    }
+
+    /*!
+    @return whether object creation completed
+    */
+    bool get_ubjson_object()
+    {
+        std::pair<std::size_t, char_int_type> size_and_type;
+        if (JSON_HEDLEY_UNLIKELY(!get_ubjson_size_type(size_and_type)))
+        {
+            return false;
+        }
+
+        string_t key;
+        if (size_and_type.first != string_t::npos)
+        {
+            if (JSON_HEDLEY_UNLIKELY(!sax->start_object(size_and_type.first)))
+            {
+                return false;
+            }
+
+            if (size_and_type.second != 0)
+            {
+                for (std::size_t i = 0; i < size_and_type.first; ++i)
+                {
+                    if (JSON_HEDLEY_UNLIKELY(!get_ubjson_string(key) || !sax->key(key)))
+                    {
+                        return false;
+                    }
+                    if (JSON_HEDLEY_UNLIKELY(!get_ubjson_value(size_and_type.second)))
+                    {
+                        return false;
+                    }
+                    key.clear();
+                }
+            }
+            else
+            {
+                for (std::size_t i = 0; i < size_and_type.first; ++i)
+                {
+                    if (JSON_HEDLEY_UNLIKELY(!get_ubjson_string(key) || !sax->key(key)))
+                    {
+                        return false;
+                    }
+                    if (JSON_HEDLEY_UNLIKELY(!parse_ubjson_internal()))
+                    {
+                        return false;
+                    }
+                    key.clear();
+                }
+            }
+        }
+        else
+        {
+            if (JSON_HEDLEY_UNLIKELY(!sax->start_object(static_cast<std::size_t>(-1))))
+            {
+                return false;
+            }
+
+            while (current != '}')
+            {
+                if (JSON_HEDLEY_UNLIKELY(!get_ubjson_string(key, false) || !sax->key(key)))
+                {
+                    return false;
+                }
+                if (JSON_HEDLEY_UNLIKELY(!parse_ubjson_internal()))
+                {
+                    return false;
+                }
+                get_ignore_noop();
+                key.clear();
+            }
+        }
+
+        return sax->end_object();
+    }
+
+    // Note: no reader for UBJSON binary types is implemented because the
+    // format does not define a binary type
+
+    bool get_ubjson_high_precision_number()
+    {
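+        // Example: the bytes 'H' 'i' 0x04 '3' '.' '1' '4' carry the
+        // four-character number string "3.14"; it is re-parsed with the JSON
+        // lexer below and forwarded as a floating-point value.
+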
+        // get size of following number string
+        std::size_t size{};
+        auto res = get_ubjson_size_value(size);
+        if (JSON_HEDLEY_UNLIKELY(!res))
+        {
+            return res;
+        }
+
+        // get number string
+        std::vector<char> number_vector;
+        for (std::size_t i = 0; i < size; ++i)
+        {
+            get();
+            if (JSON_HEDLEY_UNLIKELY(!unexpect_eof(input_format_t::ubjson, "number")))
+            {
+                return false;
+            }
+            number_vector.push_back(static_cast<char>(current));
+        }
+
+        // parse number string
+        using ia_type = decltype(detail::input_adapter(number_vector));
+        auto number_lexer = detail::lexer<BasicJsonType, ia_type>(detail::input_adapter(number_vector), false);
+        const auto result_number = number_lexer.scan();
+        const auto number_string = number_lexer.get_token_string();
+        const auto result_remainder = number_lexer.scan();
+
+        using token_type = typename detail::lexer_base<BasicJsonType>::token_type;
+
+        if (JSON_HEDLEY_UNLIKELY(result_remainder != token_type::end_of_input))
+        {
+            return sax->parse_error(chars_read, number_string, parse_error::create(115, chars_read, exception_message(input_format_t::ubjson, "invalid number text: " + number_lexer.get_token_string(), "high-precision number"), BasicJsonType()));
+        }
+
+        switch (result_number)
+        {
+            case token_type::value_integer:
+                return sax->number_integer(number_lexer.get_number_integer());
+            case token_type::value_unsigned:
+                return sax->number_unsigned(number_lexer.get_number_unsigned());
+            case token_type::value_float:
+                return sax->number_float(number_lexer.get_number_float(), std::move(number_string));
+            case token_type::uninitialized:
+            case token_type::literal_true:
+            case token_type::literal_false:
+            case token_type::literal_null:
+            case token_type::value_string:
+            case token_type::begin_array:
+            case token_type::begin_object:
+            case token_type::end_array:
+            case token_type::end_object:
+            case token_type::name_separator:
+            case token_type::value_separator:
+            case token_type::parse_error:
+            case token_type::end_of_input:
+            case token_type::literal_or_value:
+            default:
+                return sax->parse_error(chars_read, number_string, parse_error::create(115, chars_read, exception_message(input_format_t::ubjson, "invalid number text: " + number_lexer.get_token_string(), "high-precision number"), BasicJsonType()));
+        }
+    }
+
+    ///////////////////////
+    // Utility functions //
+    ///////////////////////
+
+    /*!
+    @brief get next character from the input
+
+    This function provides the interface to the used input adapter. It does
+    not throw when the input reaches EOF, but returns
+    `std::char_traits<char_type>::eof()` in that case.
+
+    @return character read from the input
+    */
+    char_int_type get()
+    {
+        ++chars_read;
+        return current = ia.get_character();
+    }
+
+    /*!
+    @return character read from the input after ignoring all 'N' entries
+    */
+    char_int_type get_ignore_noop()
+    {
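+        // 'N' is the UBJSON no-op byte and may appear between any two values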
+        do
+        {
+            get();
+        }
+        while (current == 'N');
+
+        return current;
+    }
+
+    /*!
+    @brief read a number from the input
+
+    @tparam NumberType the type of the number
+    @param[in] format   the current format (for diagnostics)
+    @param[out] result  number of type @a NumberType
+
+    @return whether conversion completed
+
+    @note This function needs to respect the system's endianness, because
+          bytes in CBOR, MessagePack, and UBJSON are stored in network order
+          (big endian) and therefore need reordering on little endian systems.
+    */
+    template<typename NumberType, bool InputIsLittleEndian = false>
+    bool get_number(const input_format_t format, NumberType& result)
+    {
+        // step 1: read input into array with system's byte order
+        std::array<std::uint8_t, sizeof(NumberType)> vec{};
+        for (std::size_t i = 0; i < sizeof(NumberType); ++i)
+        {
+            get();
+            if (JSON_HEDLEY_UNLIKELY(!unexpect_eof(format, "number")))
+            {
+                return false;
+            }
+
+            // reverse byte order prior to conversion if necessary
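+            // e.g. a big-endian uint16 0x1234 arrives as bytes {0x12, 0x34};
+            // a little-endian host stores them as {0x34, 0x12} so that the
+            // memcpy below reproduces the value 0x1234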
+            if (is_little_endian != InputIsLittleEndian)
+            {
+                vec[sizeof(NumberType) - i - 1] = static_cast<std::uint8_t>(current);
+            }
+            else
+            {
+                vec[i] = static_cast<std::uint8_t>(current); // LCOV_EXCL_LINE
+            }
+        }
+
+        // step 2: convert array into number of type T and return
+        std::memcpy(&result, vec.data(), sizeof(NumberType));
+        return true;
+    }
+
+    /*!
+    @brief create a string by reading characters from the input
+
+    @tparam NumberType the type of the length @a len
+    @param[in] format the current format (for diagnostics)
+    @param[in] len number of characters to read
+    @param[out] result string created by reading @a len bytes
+
+    @return whether string creation completed
+
+    @note We cannot reserve @a len bytes for the result, because @a len
+          may be too large. Usually, @ref unexpect_eof() detects the end of
+          the input before we run out of string memory.
+    */
+    template<typename NumberType>
+    bool get_string(const input_format_t format,
+                    const NumberType len,
+                    string_t& result)
+    {
+        bool success = true;
+        for (NumberType i = 0; i < len; i++)
+        {
+            get();
+            if (JSON_HEDLEY_UNLIKELY(!unexpect_eof(format, "string")))
+            {
+                success = false;
+                break;
+            }
+            result.push_back(static_cast<typename string_t::value_type>(current));
+        }
+        return success;
+    }
+
+    /*!
+    @brief create a byte array by reading bytes from the input
+
+    @tparam NumberType the type of the length @a len
+    @param[in] format the current format (for diagnostics)
+    @param[in] len number of bytes to read
+    @param[out] result byte array created by reading @a len bytes
+
+    @return whether byte array creation completed
+
+    @note We cannot reserve @a len bytes for the result, because @a len
+          may be too large. Usually, @ref unexpect_eof() detects the end of
+          the input before we run out of memory.
+    */
+    template<typename NumberType>
+    bool get_binary(const input_format_t format,
+                    const NumberType len,
+                    binary_t& result)
+    {
+        bool success = true;
+        for (NumberType i = 0; i < len; i++)
+        {
+            get();
+            if (JSON_HEDLEY_UNLIKELY(!unexpect_eof(format, "binary")))
+            {
+                success = false;
+                break;
+            }
+            result.push_back(static_cast<std::uint8_t>(current));
+        }
+        return success;
+    }
+
+    /*!
+    @param[in] format   the current format (for diagnostics)
+    @param[in] context  further context information (for diagnostics)
+    @return whether the last read character is not EOF
+    */
+    JSON_HEDLEY_NON_NULL(3)
+    bool unexpect_eof(const input_format_t format, const char* context) const
+    {
+        if (JSON_HEDLEY_UNLIKELY(current == std::char_traits<char_type>::eof()))
+        {
+            return sax->parse_error(chars_read, "<end of file>",
+                                    parse_error::create(110, chars_read, exception_message(format, "unexpected end of input", context), BasicJsonType()));
+        }
+        return true;
+    }
+
+    /*!
+    @return a string representation of the last read byte
+    */
+    std::string get_token_string() const
+    {
+        std::array<char, 3> cr{{}};
+        static_cast<void>((std::snprintf)(cr.data(), cr.size(), "%.2hhX", static_cast<unsigned char>(current))); // NOLINT(cppcoreguidelines-pro-type-vararg,hicpp-vararg)
+        return std::string{cr.data()};
+    }
+
+    /*!
+    @param[in] format   the current format
+    @param[in] detail   a detailed error message
+    @param[in] context  further context information
+    @return a message string to use in the parse_error exceptions
+    */
+    std::string exception_message(const input_format_t format,
+                                  const std::string& detail,
+                                  const std::string& context) const
+    {
+        std::string error_msg = "syntax error while parsing ";
+
+        switch (format)
+        {
+            case input_format_t::cbor:
+                error_msg += "CBOR";
+                break;
+
+            case input_format_t::msgpack:
+                error_msg += "MessagePack";
+                break;
+
+            case input_format_t::ubjson:
+                error_msg += "UBJSON";
+                break;
+
+            case input_format_t::bson:
+                error_msg += "BSON";
+                break;
+
+            case input_format_t::json: // LCOV_EXCL_LINE
+            default:            // LCOV_EXCL_LINE
+                JSON_ASSERT(false); // NOLINT(cert-dcl03-c,hicpp-static-assert,misc-static-assert) LCOV_EXCL_LINE
+        }
+
+        return error_msg + " " + context + ": " + detail;
+    }
+
+  private:
+    /// input adapter
+    InputAdapterType ia;
+
+    /// the current character
+    char_int_type current = std::char_traits<char_type>::eof();
+
+    /// the number of characters read
+    std::size_t chars_read = 0;
+
+    /// whether we can assume little endianness
+    const bool is_little_endian = little_endianness();
+
+    /// the SAX parser
+    json_sax_t* sax = nullptr;
+};
+}  // namespace detail
+}  // namespace nlohmann
+
+// #include <nlohmann/detail/input/input_adapters.hpp>
+
+// #include <nlohmann/detail/input/lexer.hpp>
+
+// #include <nlohmann/detail/input/parser.hpp>
+
+
+#include <cmath> // isfinite
+#include <cstdint> // uint8_t
+#include <functional> // function
+#include <string> // string
+#include <utility> // move
+#include <vector> // vector
+
+// #include <nlohmann/detail/exceptions.hpp>
+
+// #include <nlohmann/detail/input/input_adapters.hpp>
+
+// #include <nlohmann/detail/input/json_sax.hpp>
+
+// #include <nlohmann/detail/input/lexer.hpp>
+
+// #include <nlohmann/detail/macro_scope.hpp>
+
+// #include <nlohmann/detail/meta/is_sax.hpp>
+
+// #include <nlohmann/detail/value_t.hpp>
+
+
+namespace nlohmann
+{
+namespace detail
+{
+////////////
+// parser //
+////////////
+
+enum class parse_event_t : std::uint8_t
+{
+    /// the parser read `{` and started to process a JSON object
+    object_start,
+    /// the parser read `}` and finished processing a JSON object
+    object_end,
+    /// the parser read `[` and started to process a JSON array
+    array_start,
+    /// the parser read `]` and finished processing a JSON array
+    array_end,
+    /// the parser read a key of a value in an object
+    key,
+    /// the parser finished reading a JSON value
+    value
+};
+
+template<typename BasicJsonType>
+using parser_callback_t =
+    std::function<bool(int /*depth*/, parse_event_t /*event*/, BasicJsonType& /*parsed*/)>;
+
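+// Illustrative sketch (not part of the library): a callback that filters
+// events during DOM construction, assuming the default `json` specialization
+// (nlohmann::json) and its public parse() overload taking a callback.
+//
+//     json::parser_callback_t cb = [](int /*depth*/, json::parse_event_t event, json& parsed)
+//     {
+//         // returning false discards the element; here: drop key "internal"
+//         return !(event == json::parse_event_t::key && parsed == "internal");
+//     };
+//     json j = json::parse(R"({"internal": 1, "public": 2})", cb);  // {"public": 2}
+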
+/*!
+@brief syntax analysis
+
+This class implements a recursive descent parser.
+*/
+template<typename BasicJsonType, typename InputAdapterType>
+class parser
+{
+    using number_integer_t = typename BasicJsonType::number_integer_t;
+    using number_unsigned_t = typename BasicJsonType::number_unsigned_t;
+    using number_float_t = typename BasicJsonType::number_float_t;
+    using string_t = typename BasicJsonType::string_t;
+    using lexer_t = lexer<BasicJsonType, InputAdapterType>;
+    using token_type = typename lexer_t::token_type;
+
+  public:
+    /// a parser reading from an input adapter
+    explicit parser(InputAdapterType&& adapter,
+                    const parser_callback_t<BasicJsonType> cb = nullptr,
+                    const bool allow_exceptions_ = true,
+                    const bool skip_comments = false)
+        : callback(cb)
+        , m_lexer(std::move(adapter), skip_comments)
+        , allow_exceptions(allow_exceptions_)
+    {
+        // read first token
+        get_token();
+    }
+
+    /*!
+    @brief public parser interface
+
+    @param[in] strict      whether to expect the last token to be EOF
+    @param[in,out] result  parsed JSON value
+
+    @throw parse_error.101 in case of an unexpected token
+    @throw parse_error.102 if a surrogate pair is missing or malformed
+    @throw parse_error.103 if a code point is invalid
+    */
+    void parse(const bool strict, BasicJsonType& result)
+    {
+        if (callback)
+        {
+            json_sax_dom_callback_parser<BasicJsonType> sdp(result, callback, allow_exceptions);
+            sax_parse_internal(&sdp);
+
+            // in strict mode, input must be completely read
+            if (strict && (get_token() != token_type::end_of_input))
+            {
+                sdp.parse_error(m_lexer.get_position(),
+                                m_lexer.get_token_string(),
+                                parse_error::create(101, m_lexer.get_position(),
+                                                    exception_message(token_type::end_of_input, "value"), BasicJsonType()));
+            }
+
+            // in case of an error, return discarded value
+            if (sdp.is_errored())
+            {
+                result = value_t::discarded;
+                return;
+            }
+
+            // set top-level value to null if it was discarded by the callback
+            // function
+            if (result.is_discarded())
+            {
+                result = nullptr;
+            }
+        }
+        else
+        {
+            json_sax_dom_parser<BasicJsonType> sdp(result, allow_exceptions);
+            sax_parse_internal(&sdp);
+
+            // in strict mode, input must be completely read
+            if (strict && (get_token() != token_type::end_of_input))
+            {
+                sdp.parse_error(m_lexer.get_position(),
+                                m_lexer.get_token_string(),
+                                parse_error::create(101, m_lexer.get_position(), exception_message(token_type::end_of_input, "value"), BasicJsonType()));
+            }
+
+            // in case of an error, return discarded value
+            if (sdp.is_errored())
+            {
+                result = value_t::discarded;
+                return;
+            }
+        }
+
+        result.assert_invariant();
+    }
+
+    /*!
+    @brief public accept interface
+
+    @param[in] strict  whether to expect the last token to be EOF
+    @return whether the input is a proper JSON text
+    */
+    bool accept(const bool strict = true)
+    {
+        json_sax_acceptor<BasicJsonType> sax_acceptor;
+        return sax_parse(&sax_acceptor, strict);
+    }
+
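+    // Illustrative sketch (not part of the library): this member backs the
+    // json::accept() facade, which validates input without building a DOM:
+    //
+    //     json::accept(R"({"valid": true})");  // true
+    //     json::accept("{invalid");            // false (no exception thrown)
+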
+    template<typename SAX>
+    JSON_HEDLEY_NON_NULL(2)
+    bool sax_parse(SAX* sax, const bool strict = true)
+    {
+        (void)detail::is_sax_static_asserts<SAX, BasicJsonType> {};
+        const bool result = sax_parse_internal(sax);
+
+        // strict mode: next byte must be EOF
+        if (result && strict && (get_token() != token_type::end_of_input))
+        {
+            return sax->parse_error(m_lexer.get_position(),
+                                    m_lexer.get_token_string(),
+                                    parse_error::create(101, m_lexer.get_position(), exception_message(token_type::end_of_input, "value"), BasicJsonType()));
+        }
+
+        return result;
+    }
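+
+    // Illustrative sketch (not part of the library): user code usually
+    // reaches this entry point through the basic_json facade, e.g.
+    //
+    //     my_handler handler;                          // hypothetical SAX event handler
+    //     bool ok = json::sax_parse(input, &handler);  // `input` is hypothetical
+    //
+    // The is_sax_static_asserts instantiation above rejects handlers that
+    // lack any of the required event functions at compile time.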
+
+  private:
+    template<typename SAX>
+    JSON_HEDLEY_NON_NULL(2)
+    bool sax_parse_internal(SAX* sax)
+    {
+        // stack to remember the hierarchy of structured values we are parsing
+        // true = array; false = object
+        std::vector<bool> states;
+        // value to avoid a goto (see comment where set to true)
+        bool skip_to_state_evaluation = false;
+
+        while (true)
+        {
+            if (!skip_to_state_evaluation)
+            {
+                // invariant: get_token() was called before each iteration
+                switch (last_token)
+                {
+                    case token_type::begin_object:
+                    {
+                        if (JSON_HEDLEY_UNLIKELY(!sax->start_object(static_cast<std::size_t>(-1))))
+                        {
+                            return false;
+                        }
+
+                        // closing } -> we are done
+                        if (get_token() == token_type::end_object)
+                        {
+                            if (JSON_HEDLEY_UNLIKELY(!sax->end_object()))
+                            {
+                                return false;
+                            }
+                            break;
+                        }
+
+                        // parse key
+                        if (JSON_HEDLEY_UNLIKELY(last_token != token_type::value_string))
+                        {
+                            return sax->parse_error(m_lexer.get_position(),
+                                                    m_lexer.get_token_string(),
+                                                    parse_error::create(101, m_lexer.get_position(), exception_message(token_type::value_string, "object key"), BasicJsonType()));
+                        }
+                        if (JSON_HEDLEY_UNLIKELY(!sax->key(m_lexer.get_string())))
+                        {
+                            return false;
+                        }
+
+                        // parse separator (:)
+                        if (JSON_HEDLEY_UNLIKELY(get_token() != token_type::name_separator))
+                        {
+                            return sax->parse_error(m_lexer.get_position(),
+                                                    m_lexer.get_token_string(),
+                                                    parse_error::create(101, m_lexer.get_position(), exception_message(token_type::name_separator, "object separator"), BasicJsonType()));
+                        }
+
+                        // remember we are now inside an object
+                        states.push_back(false);
+
+                        // parse values
+                        get_token();
+                        continue;
+                    }
+
+                    case token_type::begin_array:
+                    {
+                        if (JSON_HEDLEY_UNLIKELY(!sax->start_array(static_cast<std::size_t>(-1))))
+                        {
+                            return false;
+                        }
+
+                        // closing ] -> we are done
+                        if (get_token() == token_type::end_array)
+                        {
+                            if (JSON_HEDLEY_UNLIKELY(!sax->end_array()))
+                            {
+                                return false;
+                            }
+                            break;
+                        }
+
+                        // remember we are now inside an array
+                        states.push_back(true);
+
+                        // parse values (no need to call get_token)
+                        continue;
+                    }
+
+                    case token_type::value_float:
+                    {
+                        const auto res = m_lexer.get_number_float();
+
+                        if (JSON_HEDLEY_UNLIKELY(!std::isfinite(res)))
+                        {
+                            return sax->parse_error(m_lexer.get_position(),
+                                                    m_lexer.get_token_string(),
+                                                    out_of_range::create(406, "number overflow parsing '" + m_lexer.get_token_string() + "'", BasicJsonType()));
+                        }
+
+                        if (JSON_HEDLEY_UNLIKELY(!sax->number_float(res, m_lexer.get_string())))
+                        {
+                            return false;
+                        }
+
+                        break;
+                    }
+
+                    case token_type::literal_false:
+                    {
+                        if (JSON_HEDLEY_UNLIKELY(!sax->boolean(false)))
+                        {
+                            return false;
+                        }
+                        break;
+                    }
+
+                    case token_type::literal_null:
+                    {
+                        if (JSON_HEDLEY_UNLIKELY(!sax->null()))
+                        {
+                            return false;
+                        }
+                        break;
+                    }
+
+                    case token_type::literal_true:
+                    {
+                        if (JSON_HEDLEY_UNLIKELY(!sax->boolean(true)))
+                        {
+                            return false;
+                        }
+                        break;
+                    }
+
+                    case token_type::value_integer:
+                    {
+                        if (JSON_HEDLEY_UNLIKELY(!sax->number_integer(m_lexer.get_number_integer())))
+                        {
+                            return false;
+                        }
+                        break;
+                    }
+
+                    case token_type::value_string:
+                    {
+                        if (JSON_HEDLEY_UNLIKELY(!sax->string(m_lexer.get_string())))
+                        {
+                            return false;
+                        }
+                        break;
+                    }
+
+                    case token_type::value_unsigned:
+                    {
+                        if (JSON_HEDLEY_UNLIKELY(!sax->number_unsigned(m_lexer.get_number_unsigned())))
+                        {
+                            return false;
+                        }
+                        break;
+                    }
+
+                    case token_type::parse_error:
+                    {
+                        // using "uninitialized" to avoid "expected" message
+                        return sax->parse_error(m_lexer.get_position(),
+                                                m_lexer.get_token_string(),
+                                                parse_error::create(101, m_lexer.get_position(), exception_message(token_type::uninitialized, "value"), BasicJsonType()));
+                    }
+
+                    case token_type::uninitialized:
+                    case token_type::end_array:
+                    case token_type::end_object:
+                    case token_type::name_separator:
+                    case token_type::value_separator:
+                    case token_type::end_of_input:
+                    case token_type::literal_or_value:
+                    default: // the last token was unexpected
+                    {
+                        return sax->parse_error(m_lexer.get_position(),
+                                                m_lexer.get_token_string(),
+                                                parse_error::create(101, m_lexer.get_position(), exception_message(token_type::literal_or_value, "value"), BasicJsonType()));
+                    }
+                }
+            }
+            else
+            {
+                skip_to_state_evaluation = false;
+            }
+
+            // we reached this line after we successfully parsed a value
+            if (states.empty())
+            {
+                // empty stack: we reached the end of the hierarchy: done
+                return true;
+            }
+
+            if (states.back())  // array
+            {
+                // comma -> next value
+                if (get_token() == token_type::value_separator)
+                {
+                    // parse a new value
+                    get_token();
+                    continue;
+                }
+
+                // closing ]
+                if (JSON_HEDLEY_LIKELY(last_token == token_type::end_array))
+                {
+                    if (JSON_HEDLEY_UNLIKELY(!sax->end_array()))
+                    {
+                        return false;
+                    }
+
+                    // We are done with this array. Before we can parse a
+                    // new value, we need to evaluate the new state first.
+                    // By setting skip_to_state_evaluation to true, we skip
+                    // the token switch on the next iteration and jump
+                    // straight to the state evaluation below.
+                    JSON_ASSERT(!states.empty());
+                    states.pop_back();
+                    skip_to_state_evaluation = true;
+                    continue;
+                }
+
+                return sax->parse_error(m_lexer.get_position(),
+                                        m_lexer.get_token_string(),
+                                        parse_error::create(101, m_lexer.get_position(), exception_message(token_type::end_array, "array"), BasicJsonType()));
+            }
+
+            // states.back() is false -> object
+
+            // comma -> next value
+            if (get_token() == token_type::value_separator)
+            {
+                // parse key
+                if (JSON_HEDLEY_UNLIKELY(get_token() != token_type::value_string))
+                {
+                    return sax->parse_error(m_lexer.get_position(),
+                                            m_lexer.get_token_string(),
+                                            parse_error::create(101, m_lexer.get_position(), exception_message(token_type::value_string, "object key"), BasicJsonType()));
+                }
+
+                if (JSON_HEDLEY_UNLIKELY(!sax->key(m_lexer.get_string())))
+                {
+                    return false;
+                }
+
+                // parse separator (:)
+                if (JSON_HEDLEY_UNLIKELY(get_token() != token_type::name_separator))
+                {
+                    return sax->parse_error(m_lexer.get_position(),
+                                            m_lexer.get_token_string(),
+                                            parse_error::create(101, m_lexer.get_position(), exception_message(token_type::name_separator, "object separator"), BasicJsonType()));
+                }
+
+                // parse values
+                get_token();
+                continue;
+            }
+
+            // closing }
+            if (JSON_HEDLEY_LIKELY(last_token == token_type::end_object))
+            {
+                if (JSON_HEDLEY_UNLIKELY(!sax->end_object()))
+                {
+                    return false;
+                }
+
+                // We are done with this object. Before we can parse a
+                // new value, we need to evaluate the new state first.
+                // By setting skip_to_state_evaluation to true, we skip the
+                // token switch on the next iteration and jump straight to
+                // the state evaluation below.
+                JSON_ASSERT(!states.empty());
+                states.pop_back();
+                skip_to_state_evaluation = true;
+                continue;
+            }
+
+            return sax->parse_error(m_lexer.get_position(),
+                                    m_lexer.get_token_string(),
+                                    parse_error::create(101, m_lexer.get_position(), exception_message(token_type::end_object, "object"), BasicJsonType()));
+        }
+    }
+
+    /// get next token from lexer
+    token_type get_token()
+    {
+        return last_token = m_lexer.scan();
+    }
+
+    std::string exception_message(const token_type expected, const std::string& context)
+    {
+        std::string error_msg = "syntax error ";
+
+        if (!context.empty())
+        {
+            error_msg += "while parsing " + context + " ";
+        }
+
+        error_msg += "- ";
+
+        if (last_token == token_type::parse_error)
+        {
+            error_msg += std::string(m_lexer.get_error_message()) + "; last read: '" +
+                         m_lexer.get_token_string() + "'";
+        }
+        else
+        {
+            error_msg += "unexpected " + std::string(lexer_t::token_type_name(last_token));
+        }
+
+        if (expected != token_type::uninitialized)
+        {
+            error_msg += "; expected " + std::string(lexer_t::token_type_name(expected));
+        }
+
+        return error_msg;
+    }
+
+  private:
+    /// callback function
+    const parser_callback_t<BasicJsonType> callback = nullptr;
+    /// the type of the last read token
+    token_type last_token = token_type::uninitialized;
+    /// the lexer
+    lexer_t m_lexer;
+    /// whether to throw exceptions in case of errors
+    const bool allow_exceptions = true;
+};
+
+}  // namespace detail
+}  // namespace nlohmann
+
+// #include <nlohmann/detail/iterators/internal_iterator.hpp>
+
+
+// #include <nlohmann/detail/iterators/primitive_iterator.hpp>
+
+
+#include <cstddef> // ptrdiff_t
+#include <limits>  // numeric_limits
+
+// #include <nlohmann/detail/macro_scope.hpp>
+
+
+namespace nlohmann
+{
+namespace detail
+{
+/*
+@brief an iterator for primitive JSON types
+
+This class models an iterator for primitive JSON types (boolean, number,
+string). Its only purpose is to allow the iterator/const_iterator classes
+to "iterate" over primitive values. Internally, the iterator is modeled by
+a `difference_type` variable. Value begin_value (`0`) models the begin,
+end_value (`1`) models past the end.
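+
+Illustrative sketch (not part of the library):
+
+    primitive_iterator_t it;
+    it.set_begin();   // m_it == 0, is_begin() == true
+    ++it;             // m_it == 1, is_end() == true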
+*/
+class primitive_iterator_t
+{
+  private:
+    using difference_type = std::ptrdiff_t;
+    static constexpr difference_type begin_value = 0;
+    static constexpr difference_type end_value = begin_value + 1;
+
+  JSON_PRIVATE_UNLESS_TESTED:
+    /// iterator as signed integer type
+    difference_type m_it = (std::numeric_limits<std::ptrdiff_t>::min)();
+
+  public:
+    constexpr difference_type get_value() const noexcept
+    {
+        return m_it;
+    }
+
+    /// set iterator to a defined beginning
+    void set_begin() noexcept
+    {
+        m_it = begin_value;
+    }
+
+    /// set iterator to a defined past the end
+    void set_end() noexcept
+    {
+        m_it = end_value;
+    }
+
+    /// return whether the iterator can be dereferenced
+    constexpr bool is_begin() const noexcept
+    {
+        return m_it == begin_value;
+    }
+
+    /// return whether the iterator is at end
+    constexpr bool is_end() const noexcept
+    {
+        return m_it == end_value;
+    }
+
+    friend constexpr bool operator==(primitive_iterator_t lhs, primitive_iterator_t rhs) noexcept
+    {
+        return lhs.m_it == rhs.m_it;
+    }
+
+    friend constexpr bool operator<(primitive_iterator_t lhs, primitive_iterator_t rhs) noexcept
+    {
+        return lhs.m_it < rhs.m_it;
+    }
+
+    primitive_iterator_t operator+(difference_type n) noexcept
+    {
+        auto result = *this;
+        result += n;
+        return result;
+    }
+
+    friend constexpr difference_type operator-(primitive_iterator_t lhs, primitive_iterator_t rhs) noexcept
+    {
+        return lhs.m_it - rhs.m_it;
+    }
+
+    primitive_iterator_t& operator++() noexcept
+    {
+        ++m_it;
+        return *this;
+    }
+
+    primitive_iterator_t const operator++(int) noexcept // NOLINT(readability-const-return-type)
+    {
+        auto result = *this;
+        ++m_it;
+        return result;
+    }
+
+    primitive_iterator_t& operator--() noexcept
+    {
+        --m_it;
+        return *this;
+    }
+
+    primitive_iterator_t const operator--(int) noexcept // NOLINT(readability-const-return-type)
+    {
+        auto result = *this;
+        --m_it;
+        return result;
+    }
+
+    primitive_iterator_t& operator+=(difference_type n) noexcept
+    {
+        m_it += n;
+        return *this;
+    }
+
+    primitive_iterator_t& operator-=(difference_type n) noexcept
+    {
+        m_it -= n;
+        return *this;
+    }
+};
+}  // namespace detail
+}  // namespace nlohmann
+
+
+namespace nlohmann
+{
+namespace detail
+{
+/*!
+@brief an iterator value
+
+@note This structure could easily be a union, but MSVC currently does not allow
+union members with complex constructors, see https://github.com/nlohmann/json/pull/105.
+*/
+template<typename BasicJsonType> struct internal_iterator
+{
+    /// iterator for JSON objects
+    typename BasicJsonType::object_t::iterator object_iterator {};
+    /// iterator for JSON arrays
+    typename BasicJsonType::array_t::iterator array_iterator {};
+    /// generic iterator for all other types
+    primitive_iterator_t primitive_iterator {};
+};
+}  // namespace detail
+}  // namespace nlohmann
+
+// #include <nlohmann/detail/iterators/iter_impl.hpp>
+
+
+#include <iterator> // iterator, random_access_iterator_tag, bidirectional_iterator_tag, advance, next
+#include <type_traits> // conditional, is_const, remove_const
+
+// #include <nlohmann/detail/exceptions.hpp>
+
+// #include <nlohmann/detail/iterators/internal_iterator.hpp>
+
+// #include <nlohmann/detail/iterators/primitive_iterator.hpp>
+
+// #include <nlohmann/detail/macro_scope.hpp>
+
+// #include <nlohmann/detail/meta/cpp_future.hpp>
+
+// #include <nlohmann/detail/meta/type_traits.hpp>
+
+// #include <nlohmann/detail/value_t.hpp>
+
+
+namespace nlohmann
+{
+namespace detail
+{
+// forward declare, to be able to friend it later on
+template<typename IteratorType> class iteration_proxy;
+template<typename IteratorType> class iteration_proxy_value;
+
+/*!
+@brief a template for a bidirectional iterator for the @ref basic_json class
+This class implements both iterators (iterator and const_iterator) for the
+@ref basic_json class.
+@note An iterator is called *initialized* when a pointer to a JSON value has
+      been set (e.g., by a constructor or a copy assignment). If the iterator is
+      default-constructed, it is *uninitialized* and most methods are undefined.
+      **The library uses assertions to detect calls on uninitialized iterators.**
+@requirement The class satisfies the following concept requirements:
+- [BidirectionalIterator](https://en.cppreference.com/w/cpp/named_req/BidirectionalIterator):
+  The iterator can be moved in both directions (i.e. incremented and
+  decremented).
+@since version 1.0.0, simplified in version 2.0.9, changed to bidirectional
+       iterators in version 3.0.0 (see https://github.com/nlohmann/json/issues/593)
+*/
+template<typename BasicJsonType>
+class iter_impl // NOLINT(cppcoreguidelines-special-member-functions,hicpp-special-member-functions)
+{
+    /// the iterator with BasicJsonType of different const-ness
+    using other_iter_impl = iter_impl<typename std::conditional<std::is_const<BasicJsonType>::value, typename std::remove_const<BasicJsonType>::type, const BasicJsonType>::type>;
+    /// allow basic_json to access private members
+    friend other_iter_impl;
+    friend BasicJsonType;
+    friend iteration_proxy<iter_impl>;
+    friend iteration_proxy_value<iter_impl>;
+
+    using object_t = typename BasicJsonType::object_t;
+    using array_t = typename BasicJsonType::array_t;
+    // make sure BasicJsonType is basic_json or const basic_json
+    static_assert(is_basic_json<typename std::remove_const<BasicJsonType>::type>::value,
+                  "iter_impl only accepts (const) basic_json");
+
+  public:
+
+    /// The std::iterator class template (used as a base class to provide typedefs) is deprecated in C++17.
+    /// The C++ Standard has never required user-defined iterators to derive from std::iterator.
+    /// A user-defined iterator should provide publicly accessible typedefs named
+    /// iterator_category, value_type, difference_type, pointer, and reference.
+    /// Note that value_type is required to be non-const, even for constant iterators.
+    using iterator_category = std::bidirectional_iterator_tag;
+
+    /// the type of the values when the iterator is dereferenced
+    using value_type = typename BasicJsonType::value_type;
+    /// a type to represent differences between iterators
+    using difference_type = typename BasicJsonType::difference_type;
+    /// defines a pointer to the type iterated over (value_type)
+    using pointer = typename std::conditional<std::is_const<BasicJsonType>::value,
+          typename BasicJsonType::const_pointer,
+          typename BasicJsonType::pointer>::type;
+    /// defines a reference to the type iterated over (value_type)
+    using reference =
+        typename std::conditional<std::is_const<BasicJsonType>::value,
+        typename BasicJsonType::const_reference,
+        typename BasicJsonType::reference>::type;
+
+    iter_impl() = default;
+    ~iter_impl() = default;
+    iter_impl(iter_impl&&) noexcept = default;
+    iter_impl& operator=(iter_impl&&) noexcept = default;
+
+    /*!
+    @brief constructor for a given JSON instance
+    @param[in] object  pointer to a JSON object for this iterator
+    @pre object != nullptr
+    @post The iterator is initialized; i.e. `m_object != nullptr`.
+    */
+    explicit iter_impl(pointer object) noexcept : m_object(object)
+    {
+        JSON_ASSERT(m_object != nullptr);
+
+        switch (m_object->m_type)
+        {
+            case value_t::object:
+            {
+                m_it.object_iterator = typename object_t::iterator();
+                break;
+            }
+
+            case value_t::array:
+            {
+                m_it.array_iterator = typename array_t::iterator();
+                break;
+            }
+
+            case value_t::null:
+            case value_t::string:
+            case value_t::boolean:
+            case value_t::number_integer:
+            case value_t::number_unsigned:
+            case value_t::number_float:
+            case value_t::binary:
+            case value_t::discarded:
+            default:
+            {
+                m_it.primitive_iterator = primitive_iterator_t();
+                break;
+            }
+        }
+    }
+
+    /*!
+    @note The conventional copy constructor and copy assignment are implicitly
+          defined. Combined with the following converting constructor and
+          assignment, they support: (1) copy from iterator to iterator, (2)
+          copy from const iterator to const iterator, and (3) conversion from
+          iterator to const iterator. However, conversion from const iterator
+          to iterator is not defined.
+    */
+
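+    // Illustrative sketch (not part of the library): the conversions provided
+    // by the members below, assuming the default `json` specialization.
+    //
+    //     json j = {1, 2, 3};
+    //     json::iterator it = j.begin();
+    //     json::const_iterator cit = it;  // (3) iterator -> const_iterator
+    //     // it = cit;                    // not defined: const -> non-const
+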
+    /*!
+    @brief const copy constructor
+    @param[in] other const iterator to copy from
+    @note This copy constructor had to be defined explicitly to circumvent a bug
+          occurring on msvc v19.0 compiler (VS 2015) debug build. For more
+          information refer to: https://github.com/nlohmann/json/issues/1608
+    */
+    iter_impl(const iter_impl<const BasicJsonType>& other) noexcept
+        : m_object(other.m_object), m_it(other.m_it)
+    {}
+
+    /*!
+    @brief converting assignment
+    @param[in] other const iterator to copy from
+    @return const/non-const iterator
+    @note It is not checked whether @a other is initialized.
+    */
+    iter_impl& operator=(const iter_impl<const BasicJsonType>& other) noexcept
+    {
+        if (&other != this)
+        {
+            m_object = other.m_object;
+            m_it = other.m_it;
+        }
+        return *this;
+    }
+
+    /*!
+    @brief converting constructor
+    @param[in] other  non-const iterator to copy from
+    @note It is not checked whether @a other is initialized.
+    */
+    iter_impl(const iter_impl<typename std::remove_const<BasicJsonType>::type>& other) noexcept
+        : m_object(other.m_object), m_it(other.m_it)
+    {}
+
+    /*!
+    @brief converting assignment
+    @param[in] other  non-const iterator to copy from
+    @return const/non-const iterator
+    @note It is not checked whether @a other is initialized.
+    */
+    iter_impl& operator=(const iter_impl<typename std::remove_const<BasicJsonType>::type>& other) noexcept // NOLINT(cert-oop54-cpp)
+    {
+        m_object = other.m_object;
+        m_it = other.m_it;
+        return *this;
+    }
+
+  JSON_PRIVATE_UNLESS_TESTED:
+    /*!
+    @brief set the iterator to the first value
+    @pre The iterator is initialized; i.e. `m_object != nullptr`.
+    */
+    void set_begin() noexcept
+    {
+        JSON_ASSERT(m_object != nullptr);
+
+        switch (m_object->m_type)
+        {
+            case value_t::object:
+            {
+                m_it.object_iterator = m_object->m_value.object->begin();
+                break;
+            }
+
+            case value_t::array:
+            {
+                m_it.array_iterator = m_object->m_value.array->begin();
+                break;
+            }
+
+            case value_t::null:
+            {
+                // set to end so begin()==end() is true: null is empty
+                m_it.primitive_iterator.set_end();
+                break;
+            }
+
+            case value_t::string:
+            case value_t::boolean:
+            case value_t::number_integer:
+            case value_t::number_unsigned:
+            case value_t::number_float:
+            case value_t::binary:
+            case value_t::discarded:
+            default:
+            {
+                m_it.primitive_iterator.set_begin();
+                break;
+            }
+        }
+    }
+
+    /*!
+    @brief set the iterator past the last value
+    @pre The iterator is initialized; i.e. `m_object != nullptr`.
+    */
+    void set_end() noexcept
+    {
+        JSON_ASSERT(m_object != nullptr);
+
+        switch (m_object->m_type)
+        {
+            case value_t::object:
+            {
+                m_it.object_iterator = m_object->m_value.object->end();
+                break;
+            }
+
+            case value_t::array:
+            {
+                m_it.array_iterator = m_object->m_value.array->end();
+                break;
+            }
+
+            case value_t::null:
+            case value_t::string:
+            case value_t::boolean:
+            case value_t::number_integer:
+            case value_t::number_unsigned:
+            case value_t::number_float:
+            case value_t::binary:
+            case value_t::discarded:
+            default:
+            {
+                m_it.primitive_iterator.set_end();
+                break;
+            }
+        }
+    }
+
+  public:
+    /*!
+    @brief return a reference to the value pointed to by the iterator
+    @pre The iterator is initialized; i.e. `m_object != nullptr`.
+    */
+    reference operator*() const
+    {
+        JSON_ASSERT(m_object != nullptr);
+
+        switch (m_object->m_type)
+        {
+            case value_t::object:
+            {
+                JSON_ASSERT(m_it.object_iterator != m_object->m_value.object->end());
+                return m_it.object_iterator->second;
+            }
+
+            case value_t::array:
+            {
+                JSON_ASSERT(m_it.array_iterator != m_object->m_value.array->end());
+                return *m_it.array_iterator;
+            }
+
+            case value_t::null:
+                JSON_THROW(invalid_iterator::create(214, "cannot get value", *m_object));
+
+            case value_t::string:
+            case value_t::boolean:
+            case value_t::number_integer:
+            case value_t::number_unsigned:
+            case value_t::number_float:
+            case value_t::binary:
+            case value_t::discarded:
+            default:
+            {
+                if (JSON_HEDLEY_LIKELY(m_it.primitive_iterator.is_begin()))
+                {
+                    return *m_object;
+                }
+
+                JSON_THROW(invalid_iterator::create(214, "cannot get value", *m_object));
+            }
+        }
+    }
+
+    /*!
+    @brief dereference the iterator
+    @pre The iterator is initialized; i.e. `m_object != nullptr`.
+    */
+    pointer operator->() const
+    {
+        JSON_ASSERT(m_object != nullptr);
+
+        switch (m_object->m_type)
+        {
+            case value_t::object:
+            {
+                JSON_ASSERT(m_it.object_iterator != m_object->m_value.object->end());
+                return &(m_it.object_iterator->second);
+            }
+
+            case value_t::array:
+            {
+                JSON_ASSERT(m_it.array_iterator != m_object->m_value.array->end());
+                return &*m_it.array_iterator;
+            }
+
+            case value_t::null:
+            case value_t::string:
+            case value_t::boolean:
+            case value_t::number_integer:
+            case value_t::number_unsigned:
+            case value_t::number_float:
+            case value_t::binary:
+            case value_t::discarded:
+            default:
+            {
+                if (JSON_HEDLEY_LIKELY(m_it.primitive_iterator.is_begin()))
+                {
+                    return m_object;
+                }
+
+                JSON_THROW(invalid_iterator::create(214, "cannot get value", *m_object));
+            }
+        }
+    }
+
+    /*!
+    @brief post-increment (it++)
+    @pre The iterator is initialized; i.e. `m_object != nullptr`.
+    */
+    iter_impl const operator++(int) // NOLINT(readability-const-return-type)
+    {
+        auto result = *this;
+        ++(*this);
+        return result;
+    }
+
+    /*!
+    @brief pre-increment (++it)
+    @pre The iterator is initialized; i.e. `m_object != nullptr`.
+    */
+    iter_impl& operator++()
+    {
+        JSON_ASSERT(m_object != nullptr);
+
+        switch (m_object->m_type)
+        {
+            case value_t::object:
+            {
+                std::advance(m_it.object_iterator, 1);
+                break;
+            }
+
+            case value_t::array:
+            {
+                std::advance(m_it.array_iterator, 1);
+                break;
+            }
+
+            case value_t::null:
+            case value_t::string:
+            case value_t::boolean:
+            case value_t::number_integer:
+            case value_t::number_unsigned:
+            case value_t::number_float:
+            case value_t::binary:
+            case value_t::discarded:
+            default:
+            {
+                ++m_it.primitive_iterator;
+                break;
+            }
+        }
+
+        return *this;
+    }
+
+    /*!
+    @brief post-decrement (it--)
+    @pre The iterator is initialized; i.e. `m_object != nullptr`.
+    */
+    iter_impl const operator--(int) // NOLINT(readability-const-return-type)
+    {
+        auto result = *this;
+        --(*this);
+        return result;
+    }
+
+    /*!
+    @brief pre-decrement (--it)
+    @pre The iterator is initialized; i.e. `m_object != nullptr`.
+    */
+    iter_impl& operator--()
+    {
+        JSON_ASSERT(m_object != nullptr);
+
+        switch (m_object->m_type)
+        {
+            case value_t::object:
+            {
+                std::advance(m_it.object_iterator, -1);
+                break;
+            }
+
+            case value_t::array:
+            {
+                std::advance(m_it.array_iterator, -1);
+                break;
+            }
+
+            case value_t::null:
+            case value_t::string:
+            case value_t::boolean:
+            case value_t::number_integer:
+            case value_t::number_unsigned:
+            case value_t::number_float:
+            case value_t::binary:
+            case value_t::discarded:
+            default:
+            {
+                --m_it.primitive_iterator;
+                break;
+            }
+        }
+
+        return *this;
+    }
+
+    /*!
+    @brief comparison: equal
+    @pre The iterator is initialized; i.e. `m_object != nullptr`.
+    */
+    template < typename IterImpl, detail::enable_if_t < (std::is_same<IterImpl, iter_impl>::value || std::is_same<IterImpl, other_iter_impl>::value), std::nullptr_t > = nullptr >
+    bool operator==(const IterImpl& other) const
+    {
+        // if objects are not the same, the comparison is undefined
+        if (JSON_HEDLEY_UNLIKELY(m_object != other.m_object))
+        {
+            JSON_THROW(invalid_iterator::create(212, "cannot compare iterators of different containers", *m_object));
+        }
+
+        JSON_ASSERT(m_object != nullptr);
+
+        switch (m_object->m_type)
+        {
+            case value_t::object:
+                return (m_it.object_iterator == other.m_it.object_iterator);
+
+            case value_t::array:
+                return (m_it.array_iterator == other.m_it.array_iterator);
+
+            case value_t::null:
+            case value_t::string:
+            case value_t::boolean:
+            case value_t::number_integer:
+            case value_t::number_unsigned:
+            case value_t::number_float:
+            case value_t::binary:
+            case value_t::discarded:
+            default:
+                return (m_it.primitive_iterator == other.m_it.primitive_iterator);
+        }
+    }
+
+    /*!
+    @brief comparison: not equal
+    @pre The iterator is initialized; i.e. `m_object != nullptr`.
+    */
+    template < typename IterImpl, detail::enable_if_t < (std::is_same<IterImpl, iter_impl>::value || std::is_same<IterImpl, other_iter_impl>::value), std::nullptr_t > = nullptr >
+    bool operator!=(const IterImpl& other) const
+    {
+        return !operator==(other);
+    }
+
+    /*!
+    @brief comparison: smaller
+    @pre The iterator is initialized; i.e. `m_object != nullptr`.
+    */
+    bool operator<(const iter_impl& other) const
+    {
+        // if objects are not the same, the comparison is undefined
+        if (JSON_HEDLEY_UNLIKELY(m_object != other.m_object))
+        {
+            JSON_THROW(invalid_iterator::create(212, "cannot compare iterators of different containers", *m_object));
+        }
+
+        JSON_ASSERT(m_object != nullptr);
+
+        switch (m_object->m_type)
+        {
+            case value_t::object:
+                JSON_THROW(invalid_iterator::create(213, "cannot compare order of object iterators", *m_object));
+
+            case value_t::array:
+                return (m_it.array_iterator < other.m_it.array_iterator);
+
+            case value_t::null:
+            case value_t::string:
+            case value_t::boolean:
+            case value_t::number_integer:
+            case value_t::number_unsigned:
+            case value_t::number_float:
+            case value_t::binary:
+            case value_t::discarded:
+            default:
+                return (m_it.primitive_iterator < other.m_it.primitive_iterator);
+        }
+    }
+
+    /*!
+    @brief comparison: less than or equal
+    @pre The iterator is initialized; i.e. `m_object != nullptr`.
+    */
+    bool operator<=(const iter_impl& other) const
+    {
+        return !other.operator<(*this);
+    }
+
+    /*!
+    @brief comparison: greater than
+    @pre The iterator is initialized; i.e. `m_object != nullptr`.
+    */
+    bool operator>(const iter_impl& other) const
+    {
+        return !operator<=(other);
+    }
+
+    /*!
+    @brief comparison: greater than or equal
+    @pre The iterator is initialized; i.e. `m_object != nullptr`.
+    */
+    bool operator>=(const iter_impl& other) const
+    {
+        return !operator<(other);
+    }
+
+    /*!
+    @brief add to iterator
+    @pre The iterator is initialized; i.e. `m_object != nullptr`.
+    */
+    iter_impl& operator+=(difference_type i)
+    {
+        JSON_ASSERT(m_object != nullptr);
+
+        switch (m_object->m_type)
+        {
+            case value_t::object:
+                JSON_THROW(invalid_iterator::create(209, "cannot use offsets with object iterators", *m_object));
+
+            case value_t::array:
+            {
+                std::advance(m_it.array_iterator, i);
+                break;
+            }
+
+            case value_t::null:
+            case value_t::string:
+            case value_t::boolean:
+            case value_t::number_integer:
+            case value_t::number_unsigned:
+            case value_t::number_float:
+            case value_t::binary:
+            case value_t::discarded:
+            default:
+            {
+                m_it.primitive_iterator += i;
+                break;
+            }
+        }
+
+        return *this;
+    }
+
+    /*!
+    @brief subtract from iterator
+    @pre The iterator is initialized; i.e. `m_object != nullptr`.
+    */
+    iter_impl& operator-=(difference_type i)
+    {
+        return operator+=(-i);
+    }
+
+    /*!
+    @brief add to iterator
+    @pre The iterator is initialized; i.e. `m_object != nullptr`.
+    */
+    iter_impl operator+(difference_type i) const
+    {
+        auto result = *this;
+        result += i;
+        return result;
+    }
+
+    /*!
+    @brief addition of distance and iterator
+    @pre The iterator is initialized; i.e. `m_object != nullptr`.
+    */
+    friend iter_impl operator+(difference_type i, const iter_impl& it)
+    {
+        auto result = it;
+        result += i;
+        return result;
+    }
+
+    /*!
+    @brief subtract from iterator
+    @pre The iterator is initialized; i.e. `m_object != nullptr`.
+    */
+    iter_impl operator-(difference_type i) const
+    {
+        auto result = *this;
+        result -= i;
+        return result;
+    }
+
+    /*!
+    @brief return difference
+    @pre The iterator is initialized; i.e. `m_object != nullptr`.
+    */
+    difference_type operator-(const iter_impl& other) const
+    {
+        JSON_ASSERT(m_object != nullptr);
+
+        switch (m_object->m_type)
+        {
+            case value_t::object:
+                JSON_THROW(invalid_iterator::create(209, "cannot use offsets with object iterators", *m_object));
+
+            case value_t::array:
+                return m_it.array_iterator - other.m_it.array_iterator;
+
+            case value_t::null:
+            case value_t::string:
+            case value_t::boolean:
+            case value_t::number_integer:
+            case value_t::number_unsigned:
+            case value_t::number_float:
+            case value_t::binary:
+            case value_t::discarded:
+            default:
+                return m_it.primitive_iterator - other.m_it.primitive_iterator;
+        }
+    }
+
+    /*!
+    @brief access to successor
+    @pre The iterator is initialized; i.e. `m_object != nullptr`.
+    */
+    reference operator[](difference_type n) const
+    {
+        JSON_ASSERT(m_object != nullptr);
+
+        switch (m_object->m_type)
+        {
+            case value_t::object:
+                JSON_THROW(invalid_iterator::create(208, "cannot use operator[] for object iterators", *m_object));
+
+            case value_t::array:
+                return *std::next(m_it.array_iterator, n);
+
+            case value_t::null:
+                JSON_THROW(invalid_iterator::create(214, "cannot get value", *m_object));
+
+            case value_t::string:
+            case value_t::boolean:
+            case value_t::number_integer:
+            case value_t::number_unsigned:
+            case value_t::number_float:
+            case value_t::binary:
+            case value_t::discarded:
+            default:
+            {
+                if (JSON_HEDLEY_LIKELY(m_it.primitive_iterator.get_value() == -n))
+                {
+                    return *m_object;
+                }
+
+                JSON_THROW(invalid_iterator::create(214, "cannot get value", *m_object));
+            }
+        }
+    }
+
+    /*!
+    @brief return the key of an object iterator
+    @pre The iterator is initialized; i.e. `m_object != nullptr`.
+    */
+    const typename object_t::key_type& key() const
+    {
+        JSON_ASSERT(m_object != nullptr);
+
+        if (JSON_HEDLEY_LIKELY(m_object->is_object()))
+        {
+            return m_it.object_iterator->first;
+        }
+
+        JSON_THROW(invalid_iterator::create(207, "cannot use key() for non-object iterators", *m_object));
+    }
+
+    /*!
+    @brief return the value of an iterator
+    @pre The iterator is initialized; i.e. `m_object != nullptr`.
+    */
+    reference value() const
+    {
+        return operator*();
+    }
+
+  JSON_PRIVATE_UNLESS_TESTED:
+    /// associated JSON instance
+    pointer m_object = nullptr;
+    /// the actual iterator of the associated instance
+    internal_iterator<typename std::remove_const<BasicJsonType>::type> m_it {};
+};
+} // namespace detail
+} // namespace nlohmann
+
+// #include <nlohmann/detail/iterators/iteration_proxy.hpp>
+
+// #include <nlohmann/detail/iterators/json_reverse_iterator.hpp>
+
+
+#include <cstddef> // ptrdiff_t
+#include <iterator> // reverse_iterator
+#include <utility> // declval
+
+namespace nlohmann
+{
+namespace detail
+{
+//////////////////////
+// reverse_iterator //
+//////////////////////
+
+/*!
+@brief a template for a reverse iterator class
+
+@tparam Base the base iterator type to reverse. Valid types are @ref
+iterator (to create @ref reverse_iterator) and @ref const_iterator (to
+create @ref const_reverse_iterator).
+
+@requirement The class satisfies the following concept requirements:
+- [BidirectionalIterator](https://en.cppreference.com/w/cpp/named_req/BidirectionalIterator):
+  The iterator can be moved in both directions (i.e. incremented and
+  decremented).
+- [OutputIterator](https://en.cppreference.com/w/cpp/named_req/OutputIterator):
+  It is possible to write to the pointed-to element (only if @a Base is
+  @ref iterator).
+
+@since version 1.0.0
+*/
+template<typename Base>
+class json_reverse_iterator : public std::reverse_iterator<Base>
+{
+  public:
+    using difference_type = std::ptrdiff_t;
+    /// shortcut to the reverse iterator adapter
+    using base_iterator = std::reverse_iterator<Base>;
+    /// the reference type for the pointed-to element
+    using reference = typename Base::reference;
+
+    /// create reverse iterator from iterator
+    explicit json_reverse_iterator(const typename base_iterator::iterator_type& it) noexcept
+        : base_iterator(it) {}
+
+    /// create reverse iterator from base class
+    explicit json_reverse_iterator(const base_iterator& it) noexcept : base_iterator(it) {}
+
+    /// post-increment (it++)
+    json_reverse_iterator const operator++(int) // NOLINT(readability-const-return-type)
+    {
+        return static_cast<json_reverse_iterator>(base_iterator::operator++(1));
+    }
+
+    /// pre-increment (++it)
+    json_reverse_iterator& operator++()
+    {
+        return static_cast<json_reverse_iterator&>(base_iterator::operator++());
+    }
+
+    /// post-decrement (it--)
+    json_reverse_iterator const operator--(int) // NOLINT(readability-const-return-type)
+    {
+        return static_cast<json_reverse_iterator>(base_iterator::operator--(1));
+    }
+
+    /// pre-decrement (--it)
+    json_reverse_iterator& operator--()
+    {
+        return static_cast<json_reverse_iterator&>(base_iterator::operator--());
+    }
+
+    /// add to iterator
+    json_reverse_iterator& operator+=(difference_type i)
+    {
+        return static_cast<json_reverse_iterator&>(base_iterator::operator+=(i));
+    }
+
+    /// add to iterator
+    json_reverse_iterator operator+(difference_type i) const
+    {
+        return static_cast<json_reverse_iterator>(base_iterator::operator+(i));
+    }
+
+    /// subtract from iterator
+    json_reverse_iterator operator-(difference_type i) const
+    {
+        return static_cast<json_reverse_iterator>(base_iterator::operator-(i));
+    }
+
+    /// return difference
+    difference_type operator-(const json_reverse_iterator& other) const
+    {
+        return base_iterator(*this) - base_iterator(other);
+    }
+
+    /// access to successor
+    reference operator[](difference_type n) const
+    {
+        return *(this->operator+(n));
+    }
+
+    /// return the key of an object iterator
+    auto key() const -> decltype(std::declval<Base>().key())
+    {
+        auto it = --this->base();
+        return it.key();
+    }
+
+    /// return the value of an iterator
+    reference value() const
+    {
+        auto it = --this->base();
+        return it.operator*();
+    }
+};
+}  // namespace detail
+}  // namespace nlohmann
+
+// #include <nlohmann/detail/iterators/primitive_iterator.hpp>
+
+// #include <nlohmann/detail/json_pointer.hpp>
+
+
+#include <algorithm> // all_of
+#include <cctype> // isdigit
+#include <limits> // max
+#include <numeric> // accumulate
+#include <string> // string
+#include <utility> // move
+#include <vector> // vector
+
+// #include <nlohmann/detail/exceptions.hpp>
+
+// #include <nlohmann/detail/macro_scope.hpp>
+
+// #include <nlohmann/detail/string_escape.hpp>
+
+// #include <nlohmann/detail/value_t.hpp>
+
+
+namespace nlohmann
+{
+
+/// @brief JSON Pointer defines a string syntax for identifying a specific value within a JSON document
+/// @sa https://json.nlohmann.me/api/json_pointer/
+template<typename BasicJsonType>
+class json_pointer
+{
+    // allow basic_json to access private members
+    NLOHMANN_BASIC_JSON_TPL_DECLARATION
+    friend class basic_json;
+
+  public:
+    /// @brief create JSON pointer
+    /// @sa https://json.nlohmann.me/api/json_pointer/json_pointer/
+    explicit json_pointer(const std::string& s = "")
+        : reference_tokens(split(s))
+    {}
+
+    /// @brief return a string representation of the JSON pointer
+    /// @sa https://json.nlohmann.me/api/json_pointer/to_string/
+    std::string to_string() const
+    {
+        return std::accumulate(reference_tokens.begin(), reference_tokens.end(),
+                               std::string{},
+                               [](const std::string & a, const std::string & b)
+        {
+            return a + "/" + detail::escape(b);
+        });
+    }
+
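+    // Illustrative sketch (not part of the library): tokens are escaped per
+    // RFC 6901 ("~" -> "~0", "/" -> "~1"), assuming the default `json` type:
+    //
+    //     json::json_pointer ptr;
+    //     ptr.push_back("a/b");   // token stored unescaped
+    //     ptr.to_string();        // "/a~1b"
+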
+    /// @brief return a string representation of the JSON pointer
+    /// @sa https://json.nlohmann.me/api/json_pointer/operator_string/
+    operator std::string() const
+    {
+        return to_string();
+    }
+
+    /// @brief append another JSON pointer at the end of this JSON pointer
+    /// @sa https://json.nlohmann.me/api/json_pointer/operator_slasheq/
+    json_pointer& operator/=(const json_pointer& ptr)
+    {
+        reference_tokens.insert(reference_tokens.end(),
+                                ptr.reference_tokens.begin(),
+                                ptr.reference_tokens.end());
+        return *this;
+    }
+
+    /// @brief append an unescaped reference token at the end of this JSON pointer
+    /// @sa https://json.nlohmann.me/api/json_pointer/operator_slasheq/
+    json_pointer& operator/=(std::string token)
+    {
+        push_back(std::move(token));
+        return *this;
+    }
+
+    /// @brief append an array index at the end of this JSON pointer
+    /// @sa https://json.nlohmann.me/api/json_pointer/operator_slasheq/
+    json_pointer& operator/=(std::size_t array_idx)
+    {
+        return *this /= std::to_string(array_idx);
+    }
+
+    /// @brief create a new JSON pointer by appending the right JSON pointer at the end of the left JSON pointer
+    /// @sa https://json.nlohmann.me/api/json_pointer/operator_slash/
+    friend json_pointer operator/(const json_pointer& lhs,
+                                  const json_pointer& rhs)
+    {
+        return json_pointer(lhs) /= rhs;
+    }
+
+    /// @brief create a new JSON pointer by appending the unescaped token at the end of the JSON pointer
+    /// @sa https://json.nlohmann.me/api/json_pointer/operator_slash/
+    friend json_pointer operator/(const json_pointer& lhs, std::string token) // NOLINT(performance-unnecessary-value-param)
+    {
+        return json_pointer(lhs) /= std::move(token);
+    }
+
+    /// @brief create a new JSON pointer by appending the array-index-token at the end of the JSON pointer
+    /// @sa https://json.nlohmann.me/api/json_pointer/operator_slash/
+    friend json_pointer operator/(const json_pointer& lhs, std::size_t array_idx)
+    {
+        return json_pointer(lhs) /= array_idx;
+    }
+
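+    // Illustrative sketch (not part of the library): the overloads above
+    // compose pointers incrementally, assuming the default `json` type:
+    //
+    //     json::json_pointer ptr("/foo");
+    //     auto p = ptr / 0 / "bar";   // p.to_string() == "/foo/0/bar"
+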
+    /// @brief returns the parent of this JSON pointer
+    /// @sa https://json.nlohmann.me/api/json_pointer/parent_pointer/
+    json_pointer parent_pointer() const
+    {
+        if (empty())
+        {
+            return *this;
+        }
+
+        json_pointer res = *this;
+        res.pop_back();
+        return res;
+    }
+
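+    // Illustrative examples (not part of the library):
+    //
+    //     json::json_pointer("/foo/bar").parent_pointer()  // "/foo"
+    //     json::json_pointer("/foo").parent_pointer()      // "" (root)
+    //     json::json_pointer("").parent_pointer()          // "" (already root)
+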
+    /// @brief remove last reference token
+    /// @sa https://json.nlohmann.me/api/json_pointer/pop_back/
+    void pop_back()
+    {
+        if (JSON_HEDLEY_UNLIKELY(empty()))
+        {
+            JSON_THROW(detail::out_of_range::create(405, "JSON pointer has no parent", BasicJsonType()));
+        }
+
+        reference_tokens.pop_back();
+    }
+
+    /// @brief return last reference token
+    /// @sa https://json.nlohmann.me/api/json_pointer/back/
+    const std::string& back() const
+    {
+        if (JSON_HEDLEY_UNLIKELY(empty()))
+        {
+            JSON_THROW(detail::out_of_range::create(405, "JSON pointer has no parent", BasicJsonType()));
+        }
+
+        return reference_tokens.back();
+    }
+
+    /// @brief append an unescaped token at the end of the reference pointer
+    /// @sa https://json.nlohmann.me/api/json_pointer/push_back/
+    void push_back(const std::string& token)
+    {
+        reference_tokens.push_back(token);
+    }
+
+    /// @brief append an unescaped token at the end of the reference pointer
+    /// @sa https://json.nlohmann.me/api/json_pointer/push_back/
+    void push_back(std::string&& token)
+    {
+        reference_tokens.push_back(std::move(token));
+    }
+
+    /// @brief return whether pointer points to the root document
+    /// @sa https://json.nlohmann.me/api/json_pointer/empty/
+    bool empty() const noexcept
+    {
+        return reference_tokens.empty();
+    }
+
+  private:
+    /*!
+    @param[in] s  reference token to be converted into an array index
+
+    @return integer representation of @a s
+
+    @throw parse_error.106  if an array index begins with '0'
+    @throw parse_error.109  if an array index does not begin with a digit
+    @throw out_of_range.404 if string @a s could not be converted to an integer
+    @throw out_of_range.410 if an array index exceeds size_type
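+
+    For example (illustrative): "0" and "42" are accepted, while "007"
+    (leading zero, error 106), "-1" (does not begin with a digit, error 109),
+    and "1x" (trailing characters, error 404) are rejected.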
+    */
+    static typename BasicJsonType::size_type array_index(const std::string& s)
+    {
+        using size_type = typename BasicJsonType::size_type;
+
+        // error condition (cf. RFC 6901, Sect. 4)
+        if (JSON_HEDLEY_UNLIKELY(s.size() > 1 && s[0] == '0'))
+        {
+            JSON_THROW(detail::parse_error::create(106, 0, "array index '" + s + "' must not begin with '0'", BasicJsonType()));
+        }
+
+        // error condition (cf. RFC 6901, Sect. 4)
+        if (JSON_HEDLEY_UNLIKELY(s.size() > 1 && !(s[0] >= '1' && s[0] <= '9')))
+        {
+            JSON_THROW(detail::parse_error::create(109, 0, "array index '" + s + "' is not a number", BasicJsonType()));
+        }
+
+        std::size_t processed_chars = 0;
+        unsigned long long res = 0;  // NOLINT(runtime/int)
+        JSON_TRY
+        {
+            res = std::stoull(s, &processed_chars);
+        }
+        JSON_CATCH(std::out_of_range&)
+        {
+            JSON_THROW(detail::out_of_range::create(404, "unresolved reference token '" + s + "'", BasicJsonType()));
+        }
+
+        // check if the string was completely read
+        if (JSON_HEDLEY_UNLIKELY(processed_chars != s.size()))
+        {
+            JSON_THROW(detail::out_of_range::create(404, "unresolved reference token '" + s + "'", BasicJsonType()));
+        }
+
+    // only triggered on special platforms (like 32-bit), see also
+        // https://github.com/nlohmann/json/pull/2203
+        if (res >= static_cast<unsigned long long>((std::numeric_limits<size_type>::max)()))  // NOLINT(runtime/int)
+        {
+            JSON_THROW(detail::out_of_range::create(410, "array index " + s + " exceeds size_type", BasicJsonType())); // LCOV_EXCL_LINE
+        }
+
+        return static_cast<size_type>(res);
+    }
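+
+    // Illustrative behavior of array_index():
+    //   array_index("0")   -> 0
+    //   array_index("42")  -> 42
+    //   array_index("007") -> throws parse_error.106 (leading zero)
+    //   array_index("12a") -> throws out_of_range.404 (trailing characters)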
+
+  JSON_PRIVATE_UNLESS_TESTED:
+    json_pointer top() const
+    {
+        if (JSON_HEDLEY_UNLIKELY(empty()))
+        {
+            JSON_THROW(detail::out_of_range::create(405, "JSON pointer has no parent", BasicJsonType()));
+        }
+
+        json_pointer result = *this;
+        result.reference_tokens = {reference_tokens[0]};
+        return result;
+    }
+
+  private:
+    /*!
+    @brief create and return a reference to the pointed to value
+
+    @complexity Linear in the number of reference tokens.
+
+    @throw parse_error.109 if array index is not a number
+    @throw type_error.313 if value cannot be unflattened
+    */
+    BasicJsonType& get_and_create(BasicJsonType& j) const
+    {
+        auto* result = &j;
+
+        // in case no reference tokens exist, return a reference to the JSON value
+        // j which will be overwritten by a primitive value
+        for (const auto& reference_token : reference_tokens)
+        {
+            switch (result->type())
+            {
+                case detail::value_t::null:
+                {
+                    if (reference_token == "0")
+                    {
+                        // start a new array if reference token is 0
+                        result = &result->operator[](0);
+                    }
+                    else
+                    {
+                        // start a new object otherwise
+                        result = &result->operator[](reference_token);
+                    }
+                    break;
+                }
+
+                case detail::value_t::object:
+                {
+                    // create an entry in the object
+                    result = &result->operator[](reference_token);
+                    break;
+                }
+
+                case detail::value_t::array:
+                {
+                    // create an entry in the array
+                    result = &result->operator[](array_index(reference_token));
+                    break;
+                }
+
+                /*
+                The following code is only reached if there exists a reference
+                token _and_ the current value is primitive. In this case, we have
+                an error situation, because primitive values may only occur as
+                a single value; that is, with an empty list of reference tokens.
+                */
+                case detail::value_t::string:
+                case detail::value_t::boolean:
+                case detail::value_t::number_integer:
+                case detail::value_t::number_unsigned:
+                case detail::value_t::number_float:
+                case detail::value_t::binary:
+                case detail::value_t::discarded:
+                default:
+                    JSON_THROW(detail::type_error::create(313, "invalid value to unflatten", j));
+            }
+        }
+
+        return *result;
+    }
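+
+    // Illustrative example: calling get_and_create with the pointer
+    // "/foo/0" on a null value j turns j into {"foo": [null]} and returns
+    // a reference to the array element, because the reference token "0"
+    // starts a new array while any other token starts a new object.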
+
+    /*!
+    @brief return a reference to the pointed to value
+
+    @note This version does not throw if a value is not present, but tries to
+          create nested values instead. For instance, calling this function
+          with pointer `"/this/that"` on a null value is equivalent to calling
+          `operator[]("this").operator[]("that")` on that value, effectively
+          changing the null value to an object.
+
+    @param[in] ptr  a JSON value
+
+    @return reference to the JSON value pointed to by the JSON pointer
+
+    @complexity Linear in the length of the JSON pointer.
+
+    @throw parse_error.106   if an array index begins with '0'
+    @throw parse_error.109   if an array index was not a number
+    @throw out_of_range.404  if the JSON pointer can not be resolved
+    */
+    BasicJsonType& get_unchecked(BasicJsonType* ptr) const
+    {
+        for (const auto& reference_token : reference_tokens)
+        {
+            // convert null values to arrays or objects before continuing
+            if (ptr->is_null())
+            {
+                // check if reference token is a number
+                const bool nums =
+                    std::all_of(reference_token.begin(), reference_token.end(),
+                                [](const unsigned char x)
+                {
+                    return std::isdigit(x);
+                });
+
+                // change value to an array for numbers or "-", and to an object otherwise
+                *ptr = (nums || reference_token == "-")
+                       ? detail::value_t::array
+                       : detail::value_t::object;
+            }
+
+            switch (ptr->type())
+            {
+                case detail::value_t::object:
+                {
+                    // use unchecked object access
+                    ptr = &ptr->operator[](reference_token);
+                    break;
+                }
+
+                case detail::value_t::array:
+                {
+                    if (reference_token == "-")
+                    {
+                        // explicitly treat "-" as index beyond the end
+                        ptr = &ptr->operator[](ptr->m_value.array->size());
+                    }
+                    else
+                    {
+                        // convert array index to number; unchecked access
+                        ptr = &ptr->operator[](array_index(reference_token));
+                    }
+                    break;
+                }
+
+                case detail::value_t::null:
+                case detail::value_t::string:
+                case detail::value_t::boolean:
+                case detail::value_t::number_integer:
+                case detail::value_t::number_unsigned:
+                case detail::value_t::number_float:
+                case detail::value_t::binary:
+                case detail::value_t::discarded:
+                default:
+                    JSON_THROW(detail::out_of_range::create(404, "unresolved reference token '" + reference_token + "'", *ptr));
+            }
+        }
+
+        return *ptr;
+    }
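+
+    // Illustrative example: with j = {"arr": [1, 2]}, resolving "/arr/0"
+    // via get_unchecked yields the element 1, while "/arr/-" appends a
+    // null element (index one past the end) and returns a reference to it.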
+
+    /*!
+    @throw parse_error.106   if an array index begins with '0'
+    @throw parse_error.109   if an array index was not a number
+    @throw out_of_range.402  if the array index '-' is used
+    @throw out_of_range.404  if the JSON pointer can not be resolved
+    */
+    BasicJsonType& get_checked(BasicJsonType* ptr) const
+    {
+        for (const auto& reference_token : reference_tokens)
+        {
+            switch (ptr->type())
+            {
+                case detail::value_t::object:
+                {
+                    // note: at performs range check
+                    ptr = &ptr->at(reference_token);
+                    break;
+                }
+
+                case detail::value_t::array:
+                {
+                    if (JSON_HEDLEY_UNLIKELY(reference_token == "-"))
+                    {
+                        // "-" always fails the range check
+                        JSON_THROW(detail::out_of_range::create(402,
+                                                                "array index '-' (" + std::to_string(ptr->m_value.array->size()) +
+                                                                ") is out of range", *ptr));
+                    }
+
+                    // note: at performs range check
+                    ptr = &ptr->at(array_index(reference_token));
+                    break;
+                }
+
+                case detail::value_t::null:
+                case detail::value_t::string:
+                case detail::value_t::boolean:
+                case detail::value_t::number_integer:
+                case detail::value_t::number_unsigned:
+                case detail::value_t::number_float:
+                case detail::value_t::binary:
+                case detail::value_t::discarded:
+                default:
+                    JSON_THROW(detail::out_of_range::create(404, "unresolved reference token '" + reference_token + "'", *ptr));
+            }
+        }
+
+        return *ptr;
+    }
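+
+    // Illustrative example: with j = {"arr": [1, 2]}, get_checked resolves
+    // "/arr/1" to 2; an out-of-range index such as "/arr/5" is rejected by
+    // at(), and "/arr/-" always throws out_of_range.402.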
+
+    /*!
+    @brief return a const reference to the pointed to value
+
+    @param[in] ptr  a JSON value
+
+    @return const reference to the JSON value pointed to by the JSON
+    pointer
+
+    @throw parse_error.106   if an array index begins with '0'
+    @throw parse_error.109   if an array index was not a number
+    @throw out_of_range.402  if the array index '-' is used
+    @throw out_of_range.404  if the JSON pointer can not be resolved
+    */
+    const BasicJsonType& get_unchecked(const BasicJsonType* ptr) const
+    {
+        for (const auto& reference_token : reference_tokens)
+        {
+            switch (ptr->type())
+            {
+                case detail::value_t::object:
+                {
+                    // use unchecked object access
+                    ptr = &ptr->operator[](reference_token);
+                    break;
+                }
+
+                case detail::value_t::array:
+                {
+                    if (JSON_HEDLEY_UNLIKELY(reference_token == "-"))
+                    {
+                        // "-" cannot be used for const access
+                        JSON_THROW(detail::out_of_range::create(402, "array index '-' (" + std::to_string(ptr->m_value.array->size()) + ") is out of range", *ptr));
+                    }
+
+                    // use unchecked array access
+                    ptr = &ptr->operator[](array_index(reference_token));
+                    break;
+                }
+
+                case detail::value_t::null:
+                case detail::value_t::string:
+                case detail::value_t::boolean:
+                case detail::value_t::number_integer:
+                case detail::value_t::number_unsigned:
+                case detail::value_t::number_float:
+                case detail::value_t::binary:
+                case detail::value_t::discarded:
+                default:
+                    JSON_THROW(detail::out_of_range::create(404, "unresolved reference token '" + reference_token + "'", *ptr));
+            }
+        }
+
+        return *ptr;
+    }
+
+    /*!
+    @throw parse_error.106   if an array index begins with '0'
+    @throw parse_error.109   if an array index was not a number
+    @throw out_of_range.402  if the array index '-' is used
+    @throw out_of_range.404  if the JSON pointer can not be resolved
+    */
+    const BasicJsonType& get_checked(const BasicJsonType* ptr) const
+    {
+        for (const auto& reference_token : reference_tokens)
+        {
+            switch (ptr->type())
+            {
+                case detail::value_t::object:
+                {
+                    // note: at performs range check
+                    ptr = &ptr->at(reference_token);
+                    break;
+                }
+
+                case detail::value_t::array:
+                {
+                    if (JSON_HEDLEY_UNLIKELY(reference_token == "-"))
+                    {
+                        // "-" always fails the range check
+                        JSON_THROW(detail::out_of_range::create(402,
+                                                                "array index '-' (" + std::to_string(ptr->m_value.array->size()) +
+                                                                ") is out of range", *ptr));
+                    }
+
+                    // note: at performs range check
+                    ptr = &ptr->at(array_index(reference_token));
+                    break;
+                }
+
+                case detail::value_t::null:
+                case detail::value_t::string:
+                case detail::value_t::boolean:
+                case detail::value_t::number_integer:
+                case detail::value_t::number_unsigned:
+                case detail::value_t::number_float:
+                case detail::value_t::binary:
+                case detail::value_t::discarded:
+                default:
+                    JSON_THROW(detail::out_of_range::create(404, "unresolved reference token '" + reference_token + "'", *ptr));
+            }
+        }
+
+        return *ptr;
+    }
+
+    /*!
+    @throw parse_error.106   if an array index begins with '0'
+    @throw parse_error.109   if an array index was not a number
+    */
+    bool contains(const BasicJsonType* ptr) const
+    {
+        for (const auto& reference_token : reference_tokens)
+        {
+            switch (ptr->type())
+            {
+                case detail::value_t::object:
+                {
+                    if (!ptr->contains(reference_token))
+                    {
+                        // we did not find the key in the object
+                        return false;
+                    }
+
+                    ptr = &ptr->operator[](reference_token);
+                    break;
+                }
+
+                case detail::value_t::array:
+                {
+                    if (JSON_HEDLEY_UNLIKELY(reference_token == "-"))
+                    {
+                        // "-" always fails the range check
+                        return false;
+                    }
+                    if (JSON_HEDLEY_UNLIKELY(reference_token.size() == 1 && !("0" <= reference_token && reference_token <= "9")))
+                    {
+                        // invalid char
+                        return false;
+                    }
+                    if (JSON_HEDLEY_UNLIKELY(reference_token.size() > 1))
+                    {
+                        if (JSON_HEDLEY_UNLIKELY(!('1' <= reference_token[0] && reference_token[0] <= '9')))
+                        {
+                            // first char should be between '1' and '9'
+                            return false;
+                        }
+                        for (std::size_t i = 1; i < reference_token.size(); i++)
+                        {
+                            if (JSON_HEDLEY_UNLIKELY(!('0' <= reference_token[i] && reference_token[i] <= '9')))
+                            {
+                                // other char should be between '0' and '9'
+                                return false;
+                            }
+                        }
+                    }
+
+                    const auto idx = array_index(reference_token);
+                    if (idx >= ptr->size())
+                    {
+                        // index out of range
+                        return false;
+                    }
+
+                    ptr = &ptr->operator[](idx);
+                    break;
+                }
+
+                case detail::value_t::null:
+                case detail::value_t::string:
+                case detail::value_t::boolean:
+                case detail::value_t::number_integer:
+                case detail::value_t::number_unsigned:
+                case detail::value_t::number_float:
+                case detail::value_t::binary:
+                case detail::value_t::discarded:
+                default:
+                {
+                    // we do not expect primitive values if there is still a
+                    // reference token to process
+                    return false;
+                }
+            }
+        }
+
+        // no reference token left means we found a primitive value
+        return true;
+    }
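+
+    // Illustrative example: contains() pre-validates array tokens so that
+    // malformed indexes report false instead of throwing; for
+    // j = {"arr": [1]}: "/arr/0" -> true, "/arr/1" -> false (out of
+    // range), "/arr/01" -> false (leading zero), "/arr/-" -> false.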
+
+    /*!
+    @brief split the string input to reference tokens
+
+    @note This function is only called by the json_pointer constructor.
+          All exceptions below are documented there.
+
+    @throw parse_error.107  if the pointer is not empty or begins with '/'
+    @throw parse_error.108  if character '~' is not followed by '0' or '1'
+    */
+    static std::vector<std::string> split(const std::string& reference_string)
+    {
+        std::vector<std::string> result;
+
+        // special case: empty reference string -> no reference tokens
+        if (reference_string.empty())
+        {
+            return result;
+        }
+
+        // check if nonempty reference string begins with slash
+        if (JSON_HEDLEY_UNLIKELY(reference_string[0] != '/'))
+        {
+            JSON_THROW(detail::parse_error::create(107, 1, "JSON pointer must be empty or begin with '/' - was: '" + reference_string + "'", BasicJsonType()));
+        }
+
+        // extract the reference tokens:
+        // - slash: position of the last read slash (or end of string)
+        // - start: position after the previous slash
+        for (
+            // search for the first slash after the first character
+            std::size_t slash = reference_string.find_first_of('/', 1),
+            // set the beginning of the first reference token
+            start = 1;
+            // we can stop if start == 0 (if slash == std::string::npos)
+            start != 0;
+            // set the beginning of the next reference token
+            // (will eventually be 0 if slash == std::string::npos)
+            start = (slash == std::string::npos) ? 0 : slash + 1,
+            // find next slash
+            slash = reference_string.find_first_of('/', start))
+        {
+            // use the text between the beginning of the reference token
+            // (start) and the last slash (slash).
+            auto reference_token = reference_string.substr(start, slash - start);
+
+            // check reference tokens are properly escaped
+            for (std::size_t pos = reference_token.find_first_of('~');
+                    pos != std::string::npos;
+                    pos = reference_token.find_first_of('~', pos + 1))
+            {
+                JSON_ASSERT(reference_token[pos] == '~');
+
+                // ~ must be followed by 0 or 1
+                if (JSON_HEDLEY_UNLIKELY(pos == reference_token.size() - 1 ||
+                                         (reference_token[pos + 1] != '0' &&
+                                          reference_token[pos + 1] != '1')))
+                {
+                    JSON_THROW(detail::parse_error::create(108, 0, "escape character '~' must be followed with '0' or '1'", BasicJsonType()));
+                }
+            }
+
+            // finally, store the reference token
+            detail::unescape(reference_token);
+            result.push_back(reference_token);
+        }
+
+        return result;
+    }
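+
+    // Illustrative example: split("/foo/0/a~1b") yields the tokens
+    // {"foo", "0", "a/b"}; the string is cut at each '/', and the RFC 6901
+    // escape sequences "~1" (for '/') and "~0" (for '~') are resolved by
+    // detail::unescape().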
+
+  private:
+    /*!
+    @param[in] reference_string  the reference string to the current value
+    @param[in] value             the value to consider
+    @param[in,out] result        the result object to insert values to
+
+    @note Empty objects or arrays are flattened to `null`.
+    */
+    static void flatten(const std::string& reference_string,
+                        const BasicJsonType& value,
+                        BasicJsonType& result)
+    {
+        switch (value.type())
+        {
+            case detail::value_t::array:
+            {
+                if (value.m_value.array->empty())
+                {
+                    // flatten empty array as null
+                    result[reference_string] = nullptr;
+                }
+                else
+                {
+                    // iterate array and use index as reference string
+                    for (std::size_t i = 0; i < value.m_value.array->size(); ++i)
+                    {
+                        flatten(reference_string + "/" + std::to_string(i),
+                                value.m_value.array->operator[](i), result);
+                    }
+                }
+                break;
+            }
+
+            case detail::value_t::object:
+            {
+                if (value.m_value.object->empty())
+                {
+                    // flatten empty object as null
+                    result[reference_string] = nullptr;
+                }
+                else
+                {
+                    // iterate object and use keys as reference string
+                    for (const auto& element : *value.m_value.object)
+                    {
+                        flatten(reference_string + "/" + detail::escape(element.first), element.second, result);
+                    }
+                }
+                break;
+            }
+
+            case detail::value_t::null:
+            case detail::value_t::string:
+            case detail::value_t::boolean:
+            case detail::value_t::number_integer:
+            case detail::value_t::number_unsigned:
+            case detail::value_t::number_float:
+            case detail::value_t::binary:
+            case detail::value_t::discarded:
+            default:
+            {
+                // add primitive value with its reference string
+                result[reference_string] = value;
+                break;
+            }
+        }
+    }
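+
+    // Illustrative example: flattening {"a": {"b": 1}, "c": []} produces
+    // {"/a/b": 1, "/c": null}; each primitive leaf is stored under its
+    // JSON pointer, and the empty array "c" is flattened to null.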
+
+    /*!
+    @param[in] value  flattened JSON
+
+    @return unflattened JSON
+
+    @throw parse_error.109 if array index is not a number
+    @throw type_error.314  if value is not an object
+    @throw type_error.315  if object values are not primitive
+    @throw type_error.313  if value cannot be unflattened
+    */
+    static BasicJsonType
+    unflatten(const BasicJsonType& value)
+    {
+        if (JSON_HEDLEY_UNLIKELY(!value.is_object()))
+        {
+            JSON_THROW(detail::type_error::create(314, "only objects can be unflattened", value));
+        }
+
+        BasicJsonType result;
+
+        // iterate the JSON object values
+        for (const auto& element : *value.m_value.object)
+        {
+            if (JSON_HEDLEY_UNLIKELY(!element.second.is_primitive()))
+            {
+                JSON_THROW(detail::type_error::create(315, "values in object must be primitive", element.second));
+            }
+
+            // assign value to reference pointed to by JSON pointer; Note that if
+            // the JSON pointer is "" (i.e., points to the whole value), function
+            // get_and_create returns a reference to result itself. An assignment
+            // will then create a primitive value.
+            json_pointer(element.first).get_and_create(result) = element.second;
+        }
+
+        return result;
+    }
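+
+    // Illustrative example: unflatten({"/a/b": 1, "/c": null}) rebuilds
+    // {"a": {"b": 1}, "c": null}. The round trip is lossy for empty
+    // containers, because flatten() encodes both {} and [] as null.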
+
+    /*!
+    @brief compares two JSON pointers for equality
+
+    @param[in] lhs  JSON pointer to compare
+    @param[in] rhs  JSON pointer to compare
+    @return whether @a lhs is equal to @a rhs
+
+    @complexity Linear in the length of the JSON pointer
+
+    @exceptionsafety No-throw guarantee: this function never throws exceptions.
+    */
+    friend bool operator==(json_pointer const& lhs,
+                           json_pointer const& rhs) noexcept
+    {
+        return lhs.reference_tokens == rhs.reference_tokens;
+    }
+
+    /*!
+    @brief compares two JSON pointers for inequality
+
+    @param[in] lhs  JSON pointer to compare
+    @param[in] rhs  JSON pointer to compare
+    @return whether @a lhs is not equal to @a rhs
+
+    @complexity Linear in the length of the JSON pointer
+
+    @exceptionsafety No-throw guarantee: this function never throws exceptions.
+    */
+    friend bool operator!=(json_pointer const& lhs,
+                           json_pointer const& rhs) noexcept
+    {
+        return !(lhs == rhs);
+    }
+
+    /// the reference tokens
+    std::vector<std::string> reference_tokens;
+};
+}  // namespace nlohmann
+
+// #include <nlohmann/detail/json_ref.hpp>
+
+
+#include <initializer_list>
+#include <utility>
+
+// #include <nlohmann/detail/meta/type_traits.hpp>
+
+
+namespace nlohmann
+{
+namespace detail
+{
+template<typename BasicJsonType>
+class json_ref
+{
+  public:
+    using value_type = BasicJsonType;
+
+    json_ref(value_type&& value)
+        : owned_value(std::move(value))
+    {}
+
+    json_ref(const value_type& value)
+        : value_ref(&value)
+    {}
+
+    json_ref(std::initializer_list<json_ref> init)
+        : owned_value(init)
+    {}
+
+    template <
+        class... Args,
+        enable_if_t<std::is_constructible<value_type, Args...>::value, int> = 0 >
+    json_ref(Args && ... args)
+        : owned_value(std::forward<Args>(args)...)
+    {}
+
+    // class should be movable only
+    json_ref(json_ref&&) noexcept = default;
+    json_ref(const json_ref&) = delete;
+    json_ref& operator=(const json_ref&) = delete;
+    json_ref& operator=(json_ref&&) = delete;
+    ~json_ref() = default;
+
+    value_type moved_or_copied() const
+    {
+        if (value_ref == nullptr)
+        {
+            return std::move(owned_value);
+        }
+        return *value_ref;
+    }
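+
+    // Note: when this json_ref owns its value (value_ref == nullptr), the
+    // value is moved out; when it merely references an lvalue, a copy is
+    // returned. This lets initializer-list construction of basic_json
+    // steal temporaries while leaving referenced lvalues untouched.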
+
+    value_type const& operator*() const
+    {
+        return value_ref ? *value_ref : owned_value;
+    }
+
+    value_type const* operator->() const
+    {
+        return &**this;
+    }
+
+  private:
+    mutable value_type owned_value = nullptr;
+    value_type const* value_ref = nullptr;
+};
+}  // namespace detail
+}  // namespace nlohmann
+
+// #include <nlohmann/detail/macro_scope.hpp>
+
+// #include <nlohmann/detail/string_escape.hpp>
+
+// #include <nlohmann/detail/meta/cpp_future.hpp>
+
+// #include <nlohmann/detail/meta/type_traits.hpp>
+
+// #include <nlohmann/detail/output/binary_writer.hpp>
+
+
+#include <algorithm> // reverse
+#include <array> // array
+#include <cmath> // isnan, isinf
+#include <cstdint> // uint8_t, uint16_t, uint32_t, uint64_t
+#include <cstring> // memcpy
+#include <limits> // numeric_limits
+#include <string> // string
+#include <utility> // move
+
+// #include <nlohmann/detail/input/binary_reader.hpp>
+
+// #include <nlohmann/detail/macro_scope.hpp>
+
+// #include <nlohmann/detail/output/output_adapters.hpp>
+
+
+#include <algorithm> // copy
+#include <cstddef> // size_t
+#include <iterator> // back_inserter
+#include <memory> // shared_ptr, make_shared
+#include <string> // basic_string
+#include <vector> // vector
+
+#ifndef JSON_NO_IO
+    #include <ios>      // streamsize
+    #include <ostream>  // basic_ostream
+#endif  // JSON_NO_IO
+
+// #include <nlohmann/detail/macro_scope.hpp>
+
+
+namespace nlohmann
+{
+namespace detail
+{
+/// abstract output adapter interface
+template<typename CharType> struct output_adapter_protocol
+{
+    virtual void write_character(CharType c) = 0;
+    virtual void write_characters(const CharType* s, std::size_t length) = 0;
+    virtual ~output_adapter_protocol() = default;
+
+    output_adapter_protocol() = default;
+    output_adapter_protocol(const output_adapter_protocol&) = default;
+    output_adapter_protocol(output_adapter_protocol&&) noexcept = default;
+    output_adapter_protocol& operator=(const output_adapter_protocol&) = default;
+    output_adapter_protocol& operator=(output_adapter_protocol&&) noexcept = default;
+};
+
+/// a type to simplify interfaces
+template<typename CharType>
+using output_adapter_t = std::shared_ptr<output_adapter_protocol<CharType>>;
+
+/// output adapter for byte vectors
+template<typename CharType, typename AllocatorType = std::allocator<CharType>>
+class output_vector_adapter : public output_adapter_protocol<CharType>
+{
+  public:
+    explicit output_vector_adapter(std::vector<CharType, AllocatorType>& vec) noexcept
+        : v(vec)
+    {}
+
+    void write_character(CharType c) override
+    {
+        v.push_back(c);
+    }
+
+    JSON_HEDLEY_NON_NULL(2)
+    void write_characters(const CharType* s, std::size_t length) override
+    {
+        std::copy(s, s + length, std::back_inserter(v));
+    }
+
+  private:
+    std::vector<CharType, AllocatorType>& v;
+};
+
+#ifndef JSON_NO_IO
+/// output adapter for output streams
+template<typename CharType>
+class output_stream_adapter : public output_adapter_protocol<CharType>
+{
+  public:
+    explicit output_stream_adapter(std::basic_ostream<CharType>& s) noexcept
+        : stream(s)
+    {}
+
+    void write_character(CharType c) override
+    {
+        stream.put(c);
+    }
+
+    JSON_HEDLEY_NON_NULL(2)
+    void write_characters(const CharType* s, std::size_t length) override
+    {
+        stream.write(s, static_cast<std::streamsize>(length));
+    }
+
+  private:
+    std::basic_ostream<CharType>& stream;
+};
+#endif  // JSON_NO_IO
+
+/// output adapter for basic_string
+template<typename CharType, typename StringType = std::basic_string<CharType>>
+class output_string_adapter : public output_adapter_protocol<CharType>
+{
+  public:
+    explicit output_string_adapter(StringType& s) noexcept
+        : str(s)
+    {}
+
+    void write_character(CharType c) override
+    {
+        str.push_back(c);
+    }
+
+    JSON_HEDLEY_NON_NULL(2)
+    void write_characters(const CharType* s, std::size_t length) override
+    {
+        str.append(s, length);
+    }
+
+  private:
+    StringType& str;
+};
+
+template<typename CharType, typename StringType = std::basic_string<CharType>>
+class output_adapter
+{
+  public:
+    template<typename AllocatorType = std::allocator<CharType>>
+    output_adapter(std::vector<CharType, AllocatorType>& vec)
+        : oa(std::make_shared<output_vector_adapter<CharType, AllocatorType>>(vec)) {}
+
+#ifndef JSON_NO_IO
+    output_adapter(std::basic_ostream<CharType>& s)
+        : oa(std::make_shared<output_stream_adapter<CharType>>(s)) {}
+#endif  // JSON_NO_IO
+
+    output_adapter(StringType& s)
+        : oa(std::make_shared<output_string_adapter<CharType, StringType>>(s)) {}
+
+    operator output_adapter_t<CharType>()
+    {
+        return oa;
+    }
+
+  private:
+    output_adapter_t<CharType> oa = nullptr;
+};
+}  // namespace detail
+}  // namespace nlohmann
+
+
+namespace nlohmann
+{
+namespace detail
+{
+///////////////////
+// binary writer //
+///////////////////
+
+/*!
+@brief serialization to CBOR and MessagePack values
+*/
+template<typename BasicJsonType, typename CharType>
+class binary_writer
+{
+    using string_t = typename BasicJsonType::string_t;
+    using binary_t = typename BasicJsonType::binary_t;
+    using number_float_t = typename BasicJsonType::number_float_t;
+
+  public:
+    /*!
+    @brief create a binary writer
+
+    @param[in] adapter  output adapter to write to
+    */
+    explicit binary_writer(output_adapter_t<CharType> adapter) : oa(std::move(adapter))
+    {
+        JSON_ASSERT(oa);
+    }
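+
+    // Illustrative usage sketch (assumes a BasicJsonType value j, e.g.
+    // nlohmann::json): serialize to CBOR bytes via the adapters above:
+    //
+    //     std::vector<std::uint8_t> out;
+    //     binary_writer<nlohmann::json, std::uint8_t> bw(
+    //         output_adapter<std::uint8_t>(out));
+    //     bw.write_cbor(j);  // 'out' now holds the CBOR encoding of j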
+
+    /*!
+    @param[in] j  JSON value to serialize
+    @pre       j.type() == value_t::object
+    */
+    void write_bson(const BasicJsonType& j)
+    {
+        switch (j.type())
+        {
+            case value_t::object:
+            {
+                write_bson_object(*j.m_value.object);
+                break;
+            }
+
+            case value_t::null:
+            case value_t::array:
+            case value_t::string:
+            case value_t::boolean:
+            case value_t::number_integer:
+            case value_t::number_unsigned:
+            case value_t::number_float:
+            case value_t::binary:
+            case value_t::discarded:
+            default:
+            {
+                JSON_THROW(type_error::create(317, "to serialize to BSON, top-level type must be object, but is " + std::string(j.type_name()), j));
+            }
+        }
+    }
+
+    /*!
+    @param[in] j  JSON value to serialize
+    */
+    void write_cbor(const BasicJsonType& j)
+    {
+        switch (j.type())
+        {
+            case value_t::null:
+            {
+                oa->write_character(to_char_type(0xF6));
+                break;
+            }
+
+            case value_t::boolean:
+            {
+                oa->write_character(j.m_value.boolean
+                                    ? to_char_type(0xF5)
+                                    : to_char_type(0xF4));
+                break;
+            }
+
+            case value_t::number_integer:
+            {
+                if (j.m_value.number_integer >= 0)
+                {
+                    // CBOR does not differentiate between positive signed
+                    // integers and unsigned integers. Therefore, we use the
+                    // same code as in the value_t::number_unsigned case here.
+                    if (j.m_value.number_integer <= 0x17)
+                    {
+                        write_number(static_cast<std::uint8_t>(j.m_value.number_integer));
+                    }
+                    else if (j.m_value.number_integer <= (std::numeric_limits<std::uint8_t>::max)())
+                    {
+                        oa->write_character(to_char_type(0x18));
+                        write_number(static_cast<std::uint8_t>(j.m_value.number_integer));
+                    }
+                    else if (j.m_value.number_integer <= (std::numeric_limits<std::uint16_t>::max)())
+                    {
+                        oa->write_character(to_char_type(0x19));
+                        write_number(static_cast<std::uint16_t>(j.m_value.number_integer));
+                    }
+                    else if (j.m_value.number_integer <= (std::numeric_limits<std::uint32_t>::max)())
+                    {
+                        oa->write_character(to_char_type(0x1A));
+                        write_number(static_cast<std::uint32_t>(j.m_value.number_integer));
+                    }
+                    else
+                    {
+                        oa->write_character(to_char_type(0x1B));
+                        write_number(static_cast<std::uint64_t>(j.m_value.number_integer));
+                    }
+                }
+                else
+                {
+                    // The conversions below encode the sign in the first
+                    // byte, and the value is converted to a positive number.
+                    const auto positive_number = -1 - j.m_value.number_integer;
+                    if (j.m_value.number_integer >= -24)
+                    {
+                        write_number(static_cast<std::uint8_t>(0x20 + positive_number));
+                    }
+                    else if (positive_number <= (std::numeric_limits<std::uint8_t>::max)())
+                    {
+                        oa->write_character(to_char_type(0x38));
+                        write_number(static_cast<std::uint8_t>(positive_number));
+                    }
+                    else if (positive_number <= (std::numeric_limits<std::uint16_t>::max)())
+                    {
+                        oa->write_character(to_char_type(0x39));
+                        write_number(static_cast<std::uint16_t>(positive_number));
+                    }
+                    else if (positive_number <= (std::numeric_limits<std::uint32_t>::max)())
+                    {
+                        oa->write_character(to_char_type(0x3A));
+                        write_number(static_cast<std::uint32_t>(positive_number));
+                    }
+                    else
+                    {
+                        oa->write_character(to_char_type(0x3B));
+                        write_number(static_cast<std::uint64_t>(positive_number));
+                    }
+                }
+                break;
+            }
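+
+            // Worked examples (illustrative): 25 is encoded as 0x18 0x19
+            // (uint8 follows), and -500 as 0x39 0x01 0xF3, because
+            // positive_number = -1 - (-500) = 499 = 0x01F3 and the sign is
+            // carried by the 0x39 (negative integer, uint16 follows) byte.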
+
+            case value_t::number_unsigned:
+            {
+                if (j.m_value.number_unsigned <= 0x17)
+                {
+                    write_number(static_cast<std::uint8_t>(j.m_value.number_unsigned));
+                }
+                else if (j.m_value.number_unsigned <= (std::numeric_limits<std::uint8_t>::max)())
+                {
+                    oa->write_character(to_char_type(0x18));
+                    write_number(static_cast<std::uint8_t>(j.m_value.number_unsigned));
+                }
+                else if (j.m_value.number_unsigned <= (std::numeric_limits<std::uint16_t>::max)())
+                {
+                    oa->write_character(to_char_type(0x19));
+                    write_number(static_cast<std::uint16_t>(j.m_value.number_unsigned));
+                }
+                else if (j.m_value.number_unsigned <= (std::numeric_limits<std::uint32_t>::max)())
+                {
+                    oa->write_character(to_char_type(0x1A));
+                    write_number(static_cast<std::uint32_t>(j.m_value.number_unsigned));
+                }
+                else
+                {
+                    oa->write_character(to_char_type(0x1B));
+                    write_number(static_cast<std::uint64_t>(j.m_value.number_unsigned));
+                }
+                break;
+            }
+
+            case value_t::number_float:
+            {
+                if (std::isnan(j.m_value.number_float))
+                {
+                    // NaN is 0xf97e00 in CBOR
+                    oa->write_character(to_char_type(0xF9));
+                    oa->write_character(to_char_type(0x7E));
+                    oa->write_character(to_char_type(0x00));
+                }
+                else if (std::isinf(j.m_value.number_float))
+                {
+                    // Infinity is 0xf97c00, -Infinity is 0xf9fc00
+                    oa->write_character(to_char_type(0xf9));
+                    oa->write_character(j.m_value.number_float > 0 ? to_char_type(0x7C) : to_char_type(0xFC));
+                    oa->write_character(to_char_type(0x00));
+                }
+                else
+                {
+                    write_compact_float(j.m_value.number_float, detail::input_format_t::cbor);
+                }
+                break;
+            }
+
+            case value_t::string:
+            {
+                // step 1: write control byte and the string length
+                const auto N = j.m_value.string->size();
+                if (N <= 0x17)
+                {
+                    write_number(static_cast<std::uint8_t>(0x60 + N));
+                }
+                else if (N <= (std::numeric_limits<std::uint8_t>::max)())
+                {
+                    oa->write_character(to_char_type(0x78));
+                    write_number(static_cast<std::uint8_t>(N));
+                }
+                else if (N <= (std::numeric_limits<std::uint16_t>::max)())
+                {
+                    oa->write_character(to_char_type(0x79));
+                    write_number(static_cast<std::uint16_t>(N));
+                }
+                else if (N <= (std::numeric_limits<std::uint32_t>::max)())
+                {
+                    oa->write_character(to_char_type(0x7A));
+                    write_number(static_cast<std::uint32_t>(N));
+                }
+                // LCOV_EXCL_START
+                else if (N <= (std::numeric_limits<std::uint64_t>::max)())
+                {
+                    oa->write_character(to_char_type(0x7B));
+                    write_number(static_cast<std::uint64_t>(N));
+                }
+                // LCOV_EXCL_STOP
+
+                // step 2: write the string
+                oa->write_characters(
+                    reinterpret_cast<const CharType*>(j.m_value.string->c_str()),
+                    j.m_value.string->size());
+                break;
+            }
+
+            case value_t::array:
+            {
+                // step 1: write control byte and the array size
+                const auto N = j.m_value.array->size();
+                if (N <= 0x17)
+                {
+                    write_number(static_cast<std::uint8_t>(0x80 + N));
+                }
+                else if (N <= (std::numeric_limits<std::uint8_t>::max)())
+                {
+                    oa->write_character(to_char_type(0x98));
+                    write_number(static_cast<std::uint8_t>(N));
+                }
+                else if (N <= (std::numeric_limits<std::uint16_t>::max)())
+                {
+                    oa->write_character(to_char_type(0x99));
+                    write_number(static_cast<std::uint16_t>(N));
+                }
+                else if (N <= (std::numeric_limits<std::uint32_t>::max)())
+                {
+                    oa->write_character(to_char_type(0x9A));
+                    write_number(static_cast<std::uint32_t>(N));
+                }
+                // LCOV_EXCL_START
+                else if (N <= (std::numeric_limits<std::uint64_t>::max)())
+                {
+                    oa->write_character(to_char_type(0x9B));
+                    write_number(static_cast<std::uint64_t>(N));
+                }
+                // LCOV_EXCL_STOP
+
+                // step 2: write each element
+                for (const auto& el : *j.m_value.array)
+                {
+                    write_cbor(el);
+                }
+                break;
+            }
+
+            case value_t::binary:
+            {
+                if (j.m_value.binary->has_subtype())
+                {
+                    if (j.m_value.binary->subtype() <= (std::numeric_limits<std::uint8_t>::max)())
+                    {
+                        write_number(static_cast<std::uint8_t>(0xd8));
+                        write_number(static_cast<std::uint8_t>(j.m_value.binary->subtype()));
+                    }
+                    else if (j.m_value.binary->subtype() <= (std::numeric_limits<std::uint16_t>::max)())
+                    {
+                        write_number(static_cast<std::uint8_t>(0xd9));
+                        write_number(static_cast<std::uint16_t>(j.m_value.binary->subtype()));
+                    }
+                    else if (j.m_value.binary->subtype() <= (std::numeric_limits<std::uint32_t>::max)())
+                    {
+                        write_number(static_cast<std::uint8_t>(0xda));
+                        write_number(static_cast<std::uint32_t>(j.m_value.binary->subtype()));
+                    }
+                    else if (j.m_value.binary->subtype() <= (std::numeric_limits<std::uint64_t>::max)())
+                    {
+                        write_number(static_cast<std::uint8_t>(0xdb));
+                        write_number(static_cast<std::uint64_t>(j.m_value.binary->subtype()));
+                    }
+                }
+
+                // step 1: write control byte and the binary array size
+                const auto N = j.m_value.binary->size();
+                if (N <= 0x17)
+                {
+                    write_number(static_cast<std::uint8_t>(0x40 + N));
+                }
+                else if (N <= (std::numeric_limits<std::uint8_t>::max)())
+                {
+                    oa->write_character(to_char_type(0x58));
+                    write_number(static_cast<std::uint8_t>(N));
+                }
+                else if (N <= (std::numeric_limits<std::uint16_t>::max)())
+                {
+                    oa->write_character(to_char_type(0x59));
+                    write_number(static_cast<std::uint16_t>(N));
+                }
+                else if (N <= (std::numeric_limits<std::uint32_t>::max)())
+                {
+                    oa->write_character(to_char_type(0x5A));
+                    write_number(static_cast<std::uint32_t>(N));
+                }
+                // LCOV_EXCL_START
+                else if (N <= (std::numeric_limits<std::uint64_t>::max)())
+                {
+                    oa->write_character(to_char_type(0x5B));
+                    write_number(static_cast<std::uint64_t>(N));
+                }
+                // LCOV_EXCL_STOP
+
+                // step 2: write each element
+                oa->write_characters(
+                    reinterpret_cast<const CharType*>(j.m_value.binary->data()),
+                    N);
+
+                break;
+            }
+
+            case value_t::object:
+            {
+                // step 1: write control byte and the object size
+                const auto N = j.m_value.object->size();
+                if (N <= 0x17)
+                {
+                    write_number(static_cast<std::uint8_t>(0xA0 + N));
+                }
+                else if (N <= (std::numeric_limits<std::uint8_t>::max)())
+                {
+                    oa->write_character(to_char_type(0xB8));
+                    write_number(static_cast<std::uint8_t>(N));
+                }
+                else if (N <= (std::numeric_limits<std::uint16_t>::max)())
+                {
+                    oa->write_character(to_char_type(0xB9));
+                    write_number(static_cast<std::uint16_t>(N));
+                }
+                else if (N <= (std::numeric_limits<std::uint32_t>::max)())
+                {
+                    oa->write_character(to_char_type(0xBA));
+                    write_number(static_cast<std::uint32_t>(N));
+                }
+                // LCOV_EXCL_START
+                else if (N <= (std::numeric_limits<std::uint64_t>::max)())
+                {
+                    oa->write_character(to_char_type(0xBB));
+                    write_number(static_cast<std::uint64_t>(N));
+                }
+                // LCOV_EXCL_STOP
+
+                // step 2: write each element
+                for (const auto& el : *j.m_value.object)
+                {
+                    write_cbor(el.first);
+                    write_cbor(el.second);
+                }
+                break;
+            }
+
+            case value_t::discarded:
+            default:
+                break;
+        }
+    }
+
+    /*!
+    @param[in] j  JSON value to serialize
+    */
+    void write_msgpack(const BasicJsonType& j)
+    {
+        switch (j.type())
+        {
+            case value_t::null: // nil
+            {
+                oa->write_character(to_char_type(0xC0));
+                break;
+            }
+
+            case value_t::boolean: // true and false
+            {
+                oa->write_character(j.m_value.boolean
+                                    ? to_char_type(0xC3)
+                                    : to_char_type(0xC2));
+                break;
+            }
+
+            case value_t::number_integer:
+            {
+                if (j.m_value.number_integer >= 0)
+                {
+                    // MessagePack does not differentiate between positive
+                    // signed integers and unsigned integers. Therefore, we use
+                    // the same code as in the value_t::number_unsigned case
+                    // here (reading the unsigned union member is safe, as the
+                    // value is known to be non-negative).
+                    if (j.m_value.number_unsigned < 128)
+                    {
+                        // positive fixnum
+                        write_number(static_cast<std::uint8_t>(j.m_value.number_integer));
+                    }
+                    else if (j.m_value.number_unsigned <= (std::numeric_limits<std::uint8_t>::max)())
+                    {
+                        // uint 8
+                        oa->write_character(to_char_type(0xCC));
+                        write_number(static_cast<std::uint8_t>(j.m_value.number_integer));
+                    }
+                    else if (j.m_value.number_unsigned <= (std::numeric_limits<std::uint16_t>::max)())
+                    {
+                        // uint 16
+                        oa->write_character(to_char_type(0xCD));
+                        write_number(static_cast<std::uint16_t>(j.m_value.number_integer));
+                    }
+                    else if (j.m_value.number_unsigned <= (std::numeric_limits<std::uint32_t>::max)())
+                    {
+                        // uint 32
+                        oa->write_character(to_char_type(0xCE));
+                        write_number(static_cast<std::uint32_t>(j.m_value.number_integer));
+                    }
+                    else if (j.m_value.number_unsigned <= (std::numeric_limits<std::uint64_t>::max)())
+                    {
+                        // uint 64
+                        oa->write_character(to_char_type(0xCF));
+                        write_number(static_cast<std::uint64_t>(j.m_value.number_integer));
+                    }
+                }
+                else
+                {
+                    if (j.m_value.number_integer >= -32)
+                    {
+                        // negative fixnum
+                        write_number(static_cast<std::int8_t>(j.m_value.number_integer));
+                    }
+                    else if (j.m_value.number_integer >= (std::numeric_limits<std::int8_t>::min)() &&
+                             j.m_value.number_integer <= (std::numeric_limits<std::int8_t>::max)())
+                    {
+                        // int 8
+                        oa->write_character(to_char_type(0xD0));
+                        write_number(static_cast<std::int8_t>(j.m_value.number_integer));
+                    }
+                    else if (j.m_value.number_integer >= (std::numeric_limits<std::int16_t>::min)() &&
+                             j.m_value.number_integer <= (std::numeric_limits<std::int16_t>::max)())
+                    {
+                        // int 16
+                        oa->write_character(to_char_type(0xD1));
+                        write_number(static_cast<std::int16_t>(j.m_value.number_integer));
+                    }
+                    else if (j.m_value.number_integer >= (std::numeric_limits<std::int32_t>::min)() &&
+                             j.m_value.number_integer <= (std::numeric_limits<std::int32_t>::max)())
+                    {
+                        // int 32
+                        oa->write_character(to_char_type(0xD2));
+                        write_number(static_cast<std::int32_t>(j.m_value.number_integer));
+                    }
+                    else if (j.m_value.number_integer >= (std::numeric_limits<std::int64_t>::min)() &&
+                             j.m_value.number_integer <= (std::numeric_limits<std::int64_t>::max)())
+                    {
+                        // int 64
+                        oa->write_character(to_char_type(0xD3));
+                        write_number(static_cast<std::int64_t>(j.m_value.number_integer));
+                    }
+                }
+                break;
+            }
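+
+            // Worked examples (illustrative): -20 fits the negative fixnum
+            // range (>= -32) and is written as the single byte 0xEC, while
+            // -33 needs int 8 and becomes 0xD0 0xDF.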
+
+            case value_t::number_unsigned:
+            {
+                if (j.m_value.number_unsigned < 128)
+                {
+                    // positive fixnum
+                    write_number(static_cast<std::uint8_t>(j.m_value.number_integer));
+                }
+                else if (j.m_value.number_unsigned <= (std::numeric_limits<std::uint8_t>::max)())
+                {
+                    // uint 8
+                    oa->write_character(to_char_type(0xCC));
+                    write_number(static_cast<std::uint8_t>(j.m_value.number_integer));
+                }
+                else if (j.m_value.number_unsigned <= (std::numeric_limits<std::uint16_t>::max)())
+                {
+                    // uint 16
+                    oa->write_character(to_char_type(0xCD));
+                    write_number(static_cast<std::uint16_t>(j.m_value.number_integer));
+                }
+                else if (j.m_value.number_unsigned <= (std::numeric_limits<std::uint32_t>::max)())
+                {
+                    // uint 32
+                    oa->write_character(to_char_type(0xCE));
+                    write_number(static_cast<std::uint32_t>(j.m_value.number_integer));
+                }
+                else if (j.m_value.number_unsigned <= (std::numeric_limits<std::uint64_t>::max)())
+                {
+                    // uint 64
+                    oa->write_character(to_char_type(0xCF));
+                    write_number(static_cast<std::uint64_t>(j.m_value.number_integer));
+                }
+                break;
+            }
+
+            case value_t::number_float:
+            {
+                write_compact_float(j.m_value.number_float, detail::input_format_t::msgpack);
+                break;
+            }
+
+            case value_t::string:
+            {
+                // step 1: write control byte and the string length
+                const auto N = j.m_value.string->size();
+                if (N <= 31)
+                {
+                    // fixstr
+                    write_number(static_cast<std::uint8_t>(0xA0 | N));
+                }
+                else if (N <= (std::numeric_limits<std::uint8_t>::max)())
+                {
+                    // str 8
+                    oa->write_character(to_char_type(0xD9));
+                    write_number(static_cast<std::uint8_t>(N));
+                }
+                else if (N <= (std::numeric_limits<std::uint16_t>::max)())
+                {
+                    // str 16
+                    oa->write_character(to_char_type(0xDA));
+                    write_number(static_cast<std::uint16_t>(N));
+                }
+                else if (N <= (std::numeric_limits<std::uint32_t>::max)())
+                {
+                    // str 32
+                    oa->write_character(to_char_type(0xDB));
+                    write_number(static_cast<std::uint32_t>(N));
+                }
+
+                // step 2: write the string
+                oa->write_characters(
+                    reinterpret_cast<const CharType*>(j.m_value.string->c_str()),
+                    j.m_value.string->size());
+                break;
+            }
+
+            case value_t::array:
+            {
+                // step 1: write control byte and the array size
+                const auto N = j.m_value.array->size();
+                if (N <= 15)
+                {
+                    // fixarray
+                    write_number(static_cast<std::uint8_t>(0x90 | N));
+                }
+                else if (N <= (std::numeric_limits<std::uint16_t>::max)())
+                {
+                    // array 16
+                    oa->write_character(to_char_type(0xDC));
+                    write_number(static_cast<std::uint16_t>(N));
+                }
+                else if (N <= (std::numeric_limits<std::uint32_t>::max)())
+                {
+                    // array 32
+                    oa->write_character(to_char_type(0xDD));
+                    write_number(static_cast<std::uint32_t>(N));
+                }
+
+                // step 2: write each element
+                for (const auto& el : *j.m_value.array)
+                {
+                    write_msgpack(el);
+                }
+                break;
+            }
+
+            case value_t::binary:
+            {
+                // step 0: determine whether the binary value has a subtype;
+                // if so, the ext or fixext types are used instead of bin
+                const bool use_ext = j.m_value.binary->has_subtype();
+
+                // step 1: write control byte and the byte string length
+                const auto N = j.m_value.binary->size();
+                if (N <= (std::numeric_limits<std::uint8_t>::max)())
+                {
+                    std::uint8_t output_type{};
+                    bool fixed = true;
+                    if (use_ext)
+                    {
+                        switch (N)
+                        {
+                            case 1:
+                                output_type = 0xD4; // fixext 1
+                                break;
+                            case 2:
+                                output_type = 0xD5; // fixext 2
+                                break;
+                            case 4:
+                                output_type = 0xD6; // fixext 4
+                                break;
+                            case 8:
+                                output_type = 0xD7; // fixext 8
+                                break;
+                            case 16:
+                                output_type = 0xD8; // fixext 16
+                                break;
+                            default:
+                                output_type = 0xC7; // ext 8
+                                fixed = false;
+                                break;
+                        }
+
+                    }
+                    else
+                    {
+                        output_type = 0xC4; // bin 8
+                        fixed = false;
+                    }
+
+                    oa->write_character(to_char_type(output_type));
+                    if (!fixed)
+                    {
+                        write_number(static_cast<std::uint8_t>(N));
+                    }
+                }
+                else if (N <= (std::numeric_limits<std::uint16_t>::max)())
+                {
+                    std::uint8_t output_type = use_ext
+                                               ? 0xC8 // ext 16
+                                               : 0xC5; // bin 16
+
+                    oa->write_character(to_char_type(output_type));
+                    write_number(static_cast<std::uint16_t>(N));
+                }
+                else if (N <= (std::numeric_limits<std::uint32_t>::max)())
+                {
+                    std::uint8_t output_type = use_ext
+                                               ? 0xC9 // ext 32
+                                               : 0xC6; // bin 32
+
+                    oa->write_character(to_char_type(output_type));
+                    write_number(static_cast<std::uint32_t>(N));
+                }
+
+                // step 1.5: if this is an ext type, write the subtype
+                if (use_ext)
+                {
+                    write_number(static_cast<std::int8_t>(j.m_value.binary->subtype()));
+                }
+
+                // step 2: write the byte string
+                oa->write_characters(
+                    reinterpret_cast<const CharType*>(j.m_value.binary->data()),
+                    N);
+
+                break;
+            }
+
+            case value_t::object:
+            {
+                // step 1: write control byte and the object size
+                const auto N = j.m_value.object->size();
+                if (N <= 15)
+                {
+                    // fixmap
+                    write_number(static_cast<std::uint8_t>(0x80 | (N & 0xF)));
+                }
+                else if (N <= (std::numeric_limits<std::uint16_t>::max)())
+                {
+                    // map 16
+                    oa->write_character(to_char_type(0xDE));
+                    write_number(static_cast<std::uint16_t>(N));
+                }
+                else if (N <= (std::numeric_limits<std::uint32_t>::max)())
+                {
+                    // map 32
+                    oa->write_character(to_char_type(0xDF));
+                    write_number(static_cast<std::uint32_t>(N));
+                }
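+                // For example, an object with two members starts with the
+                // single byte 0x82 (fixmap | 2).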
+
+                // step 2: write each element
+                for (const auto& el : *j.m_value.object)
+                {
+                    write_msgpack(el.first);
+                    write_msgpack(el.second);
+                }
+                break;
+            }
+
+            case value_t::discarded:
+            default:
+                break;
+        }
+    }
+
+    /*!
+    @param[in] j  JSON value to serialize
+    @param[in] use_count   whether to use '#' prefixes (optimized format)
+    @param[in] use_type    whether to use '$' prefixes (optimized format)
+    @param[in] add_prefix  whether prefixes need to be used for this value
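+    @note For example, with @a use_count and @a use_type enabled, the array
+          [1,2,3] is written as the bytes '[' '$' 'i' '#' 'i' 0x03 0x01 0x02 0x03.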
+    */
+    void write_ubjson(const BasicJsonType& j, const bool use_count,
+                      const bool use_type, const bool add_prefix = true)
+    {
+        switch (j.type())
+        {
+            case value_t::null:
+            {
+                if (add_prefix)
+                {
+                    oa->write_character(to_char_type('Z'));
+                }
+                break;
+            }
+
+            case value_t::boolean:
+            {
+                if (add_prefix)
+                {
+                    oa->write_character(j.m_value.boolean
+                                        ? to_char_type('T')
+                                        : to_char_type('F'));
+                }
+                break;
+            }
+
+            case value_t::number_integer:
+            {
+                write_number_with_ubjson_prefix(j.m_value.number_integer, add_prefix);
+                break;
+            }
+
+            case value_t::number_unsigned:
+            {
+                write_number_with_ubjson_prefix(j.m_value.number_unsigned, add_prefix);
+                break;
+            }
+
+            case value_t::number_float:
+            {
+                write_number_with_ubjson_prefix(j.m_value.number_float, add_prefix);
+                break;
+            }
+
+            case value_t::string:
+            {
+                if (add_prefix)
+                {
+                    oa->write_character(to_char_type('S'));
+                }
+                write_number_with_ubjson_prefix(j.m_value.string->size(), true);
+                oa->write_characters(
+                    reinterpret_cast<const CharType*>(j.m_value.string->c_str()),
+                    j.m_value.string->size());
+                break;
+            }
+
+            case value_t::array:
+            {
+                if (add_prefix)
+                {
+                    oa->write_character(to_char_type('['));
+                }
+
+                bool prefix_required = true;
+                if (use_type && !j.m_value.array->empty())
+                {
+                    JSON_ASSERT(use_count);
+                    const CharType first_prefix = ubjson_prefix(j.front());
+                    const bool same_prefix = std::all_of(j.begin() + 1, j.end(),
+                                                         [this, first_prefix](const BasicJsonType & v)
+                    {
+                        return ubjson_prefix(v) == first_prefix;
+                    });
+
+                    if (same_prefix)
+                    {
+                        prefix_required = false;
+                        oa->write_character(to_char_type('$'));
+                        oa->write_character(first_prefix);
+                    }
+                }
+
+                if (use_count)
+                {
+                    oa->write_character(to_char_type('#'));
+                    write_number_with_ubjson_prefix(j.m_value.array->size(), true);
+                }
+
+                for (const auto& el : *j.m_value.array)
+                {
+                    write_ubjson(el, use_count, use_type, prefix_required);
+                }
+
+                if (!use_count)
+                {
+                    oa->write_character(to_char_type(']'));
+                }
+
+                break;
+            }
+
+            case value_t::binary:
+            {
+                if (add_prefix)
+                {
+                    oa->write_character(to_char_type('['));
+                }
+
+                if (use_type && !j.m_value.binary->empty())
+                {
+                    JSON_ASSERT(use_count);
+                    oa->write_character(to_char_type('$'));
+                    oa->write_character('U');
+                }
+
+                if (use_count)
+                {
+                    oa->write_character(to_char_type('#'));
+                    write_number_with_ubjson_prefix(j.m_value.binary->size(), true);
+                }
+
+                if (use_type)
+                {
+                    oa->write_characters(
+                        reinterpret_cast<const CharType*>(j.m_value.binary->data()),
+                        j.m_value.binary->size());
+                }
+                else
+                {
+                    for (size_t i = 0; i < j.m_value.binary->size(); ++i)
+                    {
+                        oa->write_character(to_char_type('U'));
+                        oa->write_character(j.m_value.binary->data()[i]);
+                    }
+                }
+
+                if (!use_count)
+                {
+                    oa->write_character(to_char_type(']'));
+                }
+
+                break;
+            }
+
+            case value_t::object:
+            {
+                if (add_prefix)
+                {
+                    oa->write_character(to_char_type('{'));
+                }
+
+                bool prefix_required = true;
+                if (use_type && !j.m_value.object->empty())
+                {
+                    JSON_ASSERT(use_count);
+                    const CharType first_prefix = ubjson_prefix(j.front());
+                    const bool same_prefix = std::all_of(j.begin(), j.end(),
+                                                         [this, first_prefix](const BasicJsonType & v)
+                    {
+                        return ubjson_prefix(v) == first_prefix;
+                    });
+
+                    if (same_prefix)
+                    {
+                        prefix_required = false;
+                        oa->write_character(to_char_type('$'));
+                        oa->write_character(first_prefix);
+                    }
+                }
+
+                if (use_count)
+                {
+                    oa->write_character(to_char_type('#'));
+                    write_number_with_ubjson_prefix(j.m_value.object->size(), true);
+                }
+
+                for (const auto& el : *j.m_value.object)
+                {
+                    write_number_with_ubjson_prefix(el.first.size(), true);
+                    oa->write_characters(
+                        reinterpret_cast<const CharType*>(el.first.c_str()),
+                        el.first.size());
+                    write_ubjson(el.second, use_count, use_type, prefix_required);
+                }
+
+                if (!use_count)
+                {
+                    oa->write_character(to_char_type('}'));
+                }
+
+                break;
+            }
+
+            case value_t::discarded:
+            default:
+                break;
+        }
+    }
+
+  private:
+    //////////
+    // BSON //
+    //////////
+
+    /*!
+    @return The size of a BSON document entry header, including the id marker
+            and the entry name size (and its null-terminator).
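+    @note For example, the header for the key "key" occupies 1 + 3 + 1 = 5
+          bytes.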
+    */
+    static std::size_t calc_bson_entry_header_size(const string_t& name, const BasicJsonType& j)
+    {
+        const auto it = name.find(static_cast<typename string_t::value_type>(0));
+        if (JSON_HEDLEY_UNLIKELY(it != BasicJsonType::string_t::npos))
+        {
+            JSON_THROW(out_of_range::create(409, "BSON key cannot contain code point U+0000 (at byte " + std::to_string(it) + ")", j));
+            static_cast<void>(j); // silence a warning in configurations where JSON_THROW discards j
+        }
+
+        return /*id*/ 1ul + name.size() + /*zero-terminator*/1u;
+    }
+
+    /*!
+    @brief Writes the given @a element_type and @a name to the output adapter
+    */
+    void write_bson_entry_header(const string_t& name,
+                                 const std::uint8_t element_type)
+    {
+        oa->write_character(to_char_type(element_type)); // element type
+        oa->write_characters(
+            reinterpret_cast<const CharType*>(name.c_str()),
+            name.size() + 1u);
+    }
+
+    /*!
+    @brief Writes a BSON element with key @a name and boolean value @a value
+    */
+    void write_bson_boolean(const string_t& name,
+                            const bool value)
+    {
+        write_bson_entry_header(name, 0x08);
+        oa->write_character(value ? to_char_type(0x01) : to_char_type(0x00));
+    }
+
+    /*!
+    @brief Writes a BSON element with key @a name and double value @a value
+    */
+    void write_bson_double(const string_t& name,
+                           const double value)
+    {
+        write_bson_entry_header(name, 0x01);
+        write_number<double, true>(value);
+    }
+
+    /*!
+    @return The size of the BSON-encoded string in @a value
+    */
+    static std::size_t calc_bson_string_size(const string_t& value)
+    {
+        return sizeof(std::int32_t) + value.size() + 1ul;
+    }
+
+    /*!
+    @brief Writes a BSON element with key @a name and string value @a value
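+    @note For example, the value "hi" is written as the little-endian int32
+          length 3 (string bytes plus terminator), the bytes 'h' 'i', and a
+          trailing 0x00.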
+    */
+    void write_bson_string(const string_t& name,
+                           const string_t& value)
+    {
+        write_bson_entry_header(name, 0x02);
+
+        write_number<std::int32_t, true>(static_cast<std::int32_t>(value.size() + 1ul));
+        oa->write_characters(
+            reinterpret_cast<const CharType*>(value.c_str()),
+            value.size() + 1);
+    }
+
+    /*!
+    @brief Writes a BSON element with key @a name and null value
+    */
+    void write_bson_null(const string_t& name)
+    {
+        write_bson_entry_header(name, 0x0A);
+    }
+
+    /*!
+    @return The size of the BSON-encoded integer @a value
+    */
+    static std::size_t calc_bson_integer_size(const std::int64_t value)
+    {
+        return (std::numeric_limits<std::int32_t>::min)() <= value && value <= (std::numeric_limits<std::int32_t>::max)()
+               ? sizeof(std::int32_t)
+               : sizeof(std::int64_t);
+    }
+
+    /*!
+    @brief Writes a BSON element with key @a name and integer @a value
+    */
+    void write_bson_integer(const string_t& name,
+                            const std::int64_t value)
+    {
+        if ((std::numeric_limits<std::int32_t>::min)() <= value && value <= (std::numeric_limits<std::int32_t>::max)())
+        {
+            write_bson_entry_header(name, 0x10); // int32
+            write_number<std::int32_t, true>(static_cast<std::int32_t>(value));
+        }
+        else
+        {
+            write_bson_entry_header(name, 0x12); // int64
+            write_number<std::int64_t, true>(static_cast<std::int64_t>(value));
+        }
+    }
+
+    /*!
+    @return The size of the BSON-encoded unsigned integer @a value
+    */
+    static constexpr std::size_t calc_bson_unsigned_size(const std::uint64_t value) noexcept
+    {
+        return (value <= static_cast<std::uint64_t>((std::numeric_limits<std::int32_t>::max)()))
+               ? sizeof(std::int32_t)
+               : sizeof(std::int64_t);
+    }
+
+    /*!
+    @brief Writes a BSON element with key @a name and the unsigned value stored in @a j
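+    @note BSON has no unsigned integer type; values that do not fit into
+          int64 cannot be represented and raise error 407.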
+    */
+    void write_bson_unsigned(const string_t& name,
+                             const BasicJsonType& j)
+    {
+        if (j.m_value.number_unsigned <= static_cast<std::uint64_t>((std::numeric_limits<std::int32_t>::max)()))
+        {
+            write_bson_entry_header(name, 0x10 /* int32 */);
+            write_number<std::int32_t, true>(static_cast<std::int32_t>(j.m_value.number_unsigned));
+        }
+        else if (j.m_value.number_unsigned <= static_cast<std::uint64_t>((std::numeric_limits<std::int64_t>::max)()))
+        {
+            write_bson_entry_header(name, 0x12 /* int64 */);
+            write_number<std::int64_t, true>(static_cast<std::int64_t>(j.m_value.number_unsigned));
+        }
+        else
+        {
+            JSON_THROW(out_of_range::create(407, "integer number " + std::to_string(j.m_value.number_unsigned) + " cannot be represented by BSON as it does not fit int64", j));
+        }
+    }
+
+    /*!
+    @brief Writes a BSON element with key @a name and object @a value
+    */
+    void write_bson_object_entry(const string_t& name,
+                                 const typename BasicJsonType::object_t& value)
+    {
+        write_bson_entry_header(name, 0x03); // object
+        write_bson_object(value);
+    }
+
+    /*!
+    @return The size of the BSON-encoded array @a value
+    */
+    static std::size_t calc_bson_array_size(const typename BasicJsonType::array_t& value)
+    {
+        std::size_t array_index = 0ul;
+
+        const std::size_t embedded_document_size = std::accumulate(std::begin(value), std::end(value), static_cast<std::size_t>(0), [&array_index](std::size_t result, const typename BasicJsonType::array_t::value_type & el)
+        {
+            return result + calc_bson_element_size(std::to_string(array_index++), el);
+        });
+
+        return sizeof(std::int32_t) + embedded_document_size + 1ul;
+    }
+
+    /*!
+    @return The size of the BSON-encoded binary value @a value
+    */
+    static std::size_t calc_bson_binary_size(const typename BasicJsonType::binary_t& value)
+    {
+        return sizeof(std::int32_t) + value.size() + 1ul;
+    }
+
+    /*!
+    @brief Writes a BSON element with key @a name and array @a value
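+    @note BSON has no dedicated array type; the array is stored as an
+          embedded document whose keys are the decimal indices "0", "1", ...
+          of its elements.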
+    */
+    void write_bson_array(const string_t& name,
+                          const typename BasicJsonType::array_t& value)
+    {
+        write_bson_entry_header(name, 0x04); // array
+        write_number<std::int32_t, true>(static_cast<std::int32_t>(calc_bson_array_size(value)));
+
+        std::size_t array_index = 0ul;
+
+        for (const auto& el : value)
+        {
+            write_bson_element(std::to_string(array_index++), el);
+        }
+
+        oa->write_character(to_char_type(0x00));
+    }
+
+    /*!
+    @brief Writes a BSON element with key @a name and binary value @a value
+    */
+    void write_bson_binary(const string_t& name,
+                           const binary_t& value)
+    {
+        write_bson_entry_header(name, 0x05);
+
+        write_number<std::int32_t, true>(static_cast<std::int32_t>(value.size()));
+        write_number(value.has_subtype() ? static_cast<std::uint8_t>(value.subtype()) : static_cast<std::uint8_t>(0x00));
+
+        oa->write_characters(reinterpret_cast<const CharType*>(value.data()), value.size());
+    }
+
+    /*!
+    @brief Calculates the size necessary to serialize the JSON value @a j with its @a name
+    @return The calculated size for the BSON document entry for @a j with the given @a name.
+    */
+    static std::size_t calc_bson_element_size(const string_t& name,
+            const BasicJsonType& j)
+    {
+        const auto header_size = calc_bson_entry_header_size(name, j);
+        switch (j.type())
+        {
+            case value_t::object:
+                return header_size + calc_bson_object_size(*j.m_value.object);
+
+            case value_t::array:
+                return header_size + calc_bson_array_size(*j.m_value.array);
+
+            case value_t::binary:
+                return header_size + calc_bson_binary_size(*j.m_value.binary);
+
+            case value_t::boolean:
+                return header_size + 1ul;
+
+            case value_t::number_float:
+                return header_size + 8ul;
+
+            case value_t::number_integer:
+                return header_size + calc_bson_integer_size(j.m_value.number_integer);
+
+            case value_t::number_unsigned:
+                return header_size + calc_bson_unsigned_size(j.m_value.number_unsigned);
+
+            case value_t::string:
+                return header_size + calc_bson_string_size(*j.m_value.string);
+
+            case value_t::null:
+                return header_size + 0ul;
+
+            // LCOV_EXCL_START
+            case value_t::discarded:
+            default:
+                JSON_ASSERT(false); // NOLINT(cert-dcl03-c,hicpp-static-assert,misc-static-assert)
+                return 0ul;
+                // LCOV_EXCL_STOP
+        }
+    }
+
+    /*!
+    @brief Serializes the JSON value @a j to BSON and associates it with the
+           key @a name.
+    @param name The name to associate with the JSON entity @a j within the
+                current BSON document
+    */
+    void write_bson_element(const string_t& name,
+                            const BasicJsonType& j)
+    {
+        switch (j.type())
+        {
+            case value_t::object:
+                return write_bson_object_entry(name, *j.m_value.object);
+
+            case value_t::array:
+                return write_bson_array(name, *j.m_value.array);
+
+            case value_t::binary:
+                return write_bson_binary(name, *j.m_value.binary);
+
+            case value_t::boolean:
+                return write_bson_boolean(name, j.m_value.boolean);
+
+            case value_t::number_float:
+                return write_bson_double(name, j.m_value.number_float);
+
+            case value_t::number_integer:
+                return write_bson_integer(name, j.m_value.number_integer);
+
+            case value_t::number_unsigned:
+                return write_bson_unsigned(name, j);
+
+            case value_t::string:
+                return write_bson_string(name, *j.m_value.string);
+
+            case value_t::null:
+                return write_bson_null(name);
+
+            // LCOV_EXCL_START
+            case value_t::discarded:
+            default:
+                JSON_ASSERT(false); // NOLINT(cert-dcl03-c,hicpp-static-assert,misc-static-assert)
+                return;
+                // LCOV_EXCL_STOP
+        }
+    }
+
+    /*!
+    @brief Calculates the size of the BSON serialization of the given
+           JSON object @a value.
+    @param[in] value  JSON object to serialize
+    @pre       value.type() == value_t::object
+    */
+    static std::size_t calc_bson_object_size(const typename BasicJsonType::object_t& value)
+    {
+        std::size_t document_size = std::accumulate(value.begin(), value.end(), static_cast<std::size_t>(0),
+                                    [](size_t result, const typename BasicJsonType::object_t::value_type & el)
+        {
+            return result + calc_bson_element_size(el.first, el.second);
+        });
+
+        return sizeof(std::int32_t) + document_size + 1ul;
+    }
+
+    /*!
+    @param[in] value  JSON value to serialize
+    @pre       value.type() == value_t::object
+    */
+    void write_bson_object(const typename BasicJsonType::object_t& value)
+    {
+        write_number<std::int32_t, true>(static_cast<std::int32_t>(calc_bson_object_size(value)));
+
+        for (const auto& el : value)
+        {
+            write_bson_element(el.first, el.second);
+        }
+
+        oa->write_character(to_char_type(0x00));
+    }
+
+    //////////
+    // CBOR //
+    //////////
+
+    static constexpr CharType get_cbor_float_prefix(float /*unused*/)
+    {
+        return to_char_type(0xFA);  // Single-Precision Float
+    }
+
+    static constexpr CharType get_cbor_float_prefix(double /*unused*/)
+    {
+        return to_char_type(0xFB);  // Double-Precision Float
+    }
+
+    /////////////
+    // MsgPack //
+    /////////////
+
+    static constexpr CharType get_msgpack_float_prefix(float /*unused*/)
+    {
+        return to_char_type(0xCA);  // float 32
+    }
+
+    static constexpr CharType get_msgpack_float_prefix(double /*unused*/)
+    {
+        return to_char_type(0xCB);  // float 64
+    }
+
+    ////////////
+    // UBJSON //
+    ////////////
+
+    // UBJSON: write number (floating point)
+    template<typename NumberType, typename std::enable_if<
+                 std::is_floating_point<NumberType>::value, int>::type = 0>
+    void write_number_with_ubjson_prefix(const NumberType n,
+                                         const bool add_prefix)
+    {
+        if (add_prefix)
+        {
+            oa->write_character(get_ubjson_float_prefix(n));
+        }
+        write_number(n);
+    }
+
+    // UBJSON: write number (unsigned integer)
+    template<typename NumberType, typename std::enable_if<
+                 std::is_unsigned<NumberType>::value, int>::type = 0>
+    void write_number_with_ubjson_prefix(const NumberType n,
+                                         const bool add_prefix)
+    {
+        if (n <= static_cast<std::uint64_t>((std::numeric_limits<std::int8_t>::max)()))
+        {
+            if (add_prefix)
+            {
+                oa->write_character(to_char_type('i'));  // int8
+            }
+            write_number(static_cast<std::uint8_t>(n));
+        }
+        else if (n <= (std::numeric_limits<std::uint8_t>::max)())
+        {
+            if (add_prefix)
+            {
+                oa->write_character(to_char_type('U'));  // uint8
+            }
+            write_number(static_cast<std::uint8_t>(n));
+        }
+        else if (n <= static_cast<std::uint64_t>((std::numeric_limits<std::int16_t>::max)()))
+        {
+            if (add_prefix)
+            {
+                oa->write_character(to_char_type('I'));  // int16
+            }
+            write_number(static_cast<std::int16_t>(n));
+        }
+        else if (n <= static_cast<std::uint64_t>((std::numeric_limits<std::int32_t>::max)()))
+        {
+            if (add_prefix)
+            {
+                oa->write_character(to_char_type('l'));  // int32
+            }
+            write_number(static_cast<std::int32_t>(n));
+        }
+        else if (n <= static_cast<std::uint64_t>((std::numeric_limits<std::int64_t>::max)()))
+        {
+            if (add_prefix)
+            {
+                oa->write_character(to_char_type('L'));  // int64
+            }
+            write_number(static_cast<std::int64_t>(n));
+        }
+        else
+        {
+            if (add_prefix)
+            {
+                oa->write_character(to_char_type('H'));  // high-precision number
+            }
+
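+            // For example, n = 18446744073709551615 (2^64 - 1) does not fit
+            // into int64 and is written as 'H', the length prefix 'i' 0x14,
+            // and its 20 decimal digits.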
+            const auto number = BasicJsonType(n).dump();
+            write_number_with_ubjson_prefix(number.size(), true);
+            for (std::size_t i = 0; i < number.size(); ++i)
+            {
+                oa->write_character(to_char_type(static_cast<std::uint8_t>(number[i])));
+            }
+        }
+    }
+
+    // UBJSON: write number (signed integer)
+    template < typename NumberType, typename std::enable_if <
+                   std::is_signed<NumberType>::value&&
+                   !std::is_floating_point<NumberType>::value, int >::type = 0 >
+    void write_number_with_ubjson_prefix(const NumberType n,
+                                         const bool add_prefix)
+    {
+        if ((std::numeric_limits<std::int8_t>::min)() <= n && n <= (std::numeric_limits<std::int8_t>::max)())
+        {
+            if (add_prefix)
+            {
+                oa->write_character(to_char_type('i'));  // int8
+            }
+            write_number(static_cast<std::int8_t>(n));
+        }
+        else if (static_cast<std::int64_t>((std::numeric_limits<std::uint8_t>::min)()) <= n && n <= static_cast<std::int64_t>((std::numeric_limits<std::uint8_t>::max)()))
+        {
+            if (add_prefix)
+            {
+                oa->write_character(to_char_type('U'));  // uint8
+            }
+            write_number(static_cast<std::uint8_t>(n));
+        }
+        else if ((std::numeric_limits<std::int16_t>::min)() <= n && n <= (std::numeric_limits<std::int16_t>::max)())
+        {
+            if (add_prefix)
+            {
+                oa->write_character(to_char_type('I'));  // int16
+            }
+            write_number(static_cast<std::int16_t>(n));
+        }
+        else if ((std::numeric_limits<std::int32_t>::min)() <= n && n <= (std::numeric_limits<std::int32_t>::max)())
+        {
+            if (add_prefix)
+            {
+                oa->write_character(to_char_type('l'));  // int32
+            }
+            write_number(static_cast<std::int32_t>(n));
+        }
+        else if ((std::numeric_limits<std::int64_t>::min)() <= n && n <= (std::numeric_limits<std::int64_t>::max)())
+        {
+            if (add_prefix)
+            {
+                oa->write_character(to_char_type('L'));  // int64
+            }
+            write_number(static_cast<std::int64_t>(n));
+        }
+        // LCOV_EXCL_START
+        else
+        {
+            if (add_prefix)
+            {
+                oa->write_character(to_char_type('H'));  // high-precision number
+            }
+
+            const auto number = BasicJsonType(n).dump();
+            write_number_with_ubjson_prefix(number.size(), true);
+            for (std::size_t i = 0; i < number.size(); ++i)
+            {
+                oa->write_character(to_char_type(static_cast<std::uint8_t>(number[i])));
+            }
+        }
+        // LCOV_EXCL_STOP
+    }
+
+    /*!
+    @brief determine the type prefix of container values
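+    @note For example, the integer 300 fits neither int8 nor uint8 but fits
+          int16, so its prefix is 'I'.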
+    */
+    CharType ubjson_prefix(const BasicJsonType& j) const noexcept
+    {
+        switch (j.type())
+        {
+            case value_t::null:
+                return 'Z';
+
+            case value_t::boolean:
+                return j.m_value.boolean ? 'T' : 'F';
+
+            case value_t::number_integer:
+            {
+                if ((std::numeric_limits<std::int8_t>::min)() <= j.m_value.number_integer && j.m_value.number_integer <= (std::numeric_limits<std::int8_t>::max)())
+                {
+                    return 'i';
+                }
+                if ((std::numeric_limits<std::uint8_t>::min)() <= j.m_value.number_integer && j.m_value.number_integer <= (std::numeric_limits<std::uint8_t>::max)())
+                {
+                    return 'U';
+                }
+                if ((std::numeric_limits<std::int16_t>::min)() <= j.m_value.number_integer && j.m_value.number_integer <= (std::numeric_limits<std::int16_t>::max)())
+                {
+                    return 'I';
+                }
+                if ((std::numeric_limits<std::int32_t>::min)() <= j.m_value.number_integer && j.m_value.number_integer <= (std::numeric_limits<std::int32_t>::max)())
+                {
+                    return 'l';
+                }
+                if ((std::numeric_limits<std::int64_t>::min)() <= j.m_value.number_integer && j.m_value.number_integer <= (std::numeric_limits<std::int64_t>::max)())
+                {
+                    return 'L';
+                }
+                // anything else is treated as high-precision number
+                return 'H'; // LCOV_EXCL_LINE
+            }
+
+            case value_t::number_unsigned:
+            {
+                if (j.m_value.number_unsigned <= static_cast<std::uint64_t>((std::numeric_limits<std::int8_t>::max)()))
+                {
+                    return 'i';
+                }
+                if (j.m_value.number_unsigned <= static_cast<std::uint64_t>((std::numeric_limits<std::uint8_t>::max)()))
+                {
+                    return 'U';
+                }
+                if (j.m_value.number_unsigned <= static_cast<std::uint64_t>((std::numeric_limits<std::int16_t>::max)()))
+                {
+                    return 'I';
+                }
+                if (j.m_value.number_unsigned <= static_cast<std::uint64_t>((std::numeric_limits<std::int32_t>::max)()))
+                {
+                    return 'l';
+                }
+                if (j.m_value.number_unsigned <= static_cast<std::uint64_t>((std::numeric_limits<std::int64_t>::max)()))
+                {
+                    return 'L';
+                }
+                // anything else is treated as high-precision number
+                return 'H'; // LCOV_EXCL_LINE
+            }
+
+            case value_t::number_float:
+                return get_ubjson_float_prefix(j.m_value.number_float);
+
+            case value_t::string:
+                return 'S';
+
+            case value_t::array: // fallthrough
+            case value_t::binary:
+                return '[';
+
+            case value_t::object:
+                return '{';
+
+            case value_t::discarded:
+            default:  // discarded values
+                return 'N';
+        }
+    }
+
+    static constexpr CharType get_ubjson_float_prefix(float /*unused*/)
+    {
+        return 'd';  // float 32
+    }
+
+    static constexpr CharType get_ubjson_float_prefix(double /*unused*/)
+    {
+        return 'D';  // float 64
+    }
+
+    ///////////////////////
+    // Utility functions //
+    ///////////////////////
+
+    /*!
+    @brief write a number to the output
+    @param[in] n number of type @a NumberType
+    @tparam NumberType the type of the number
+    @tparam OutputIsLittleEndian Set to true if output data is
+                                 required to be little endian
+
+    @note This function needs to respect the system's endianness, because bytes
+          in CBOR, MessagePack, and UBJSON are stored in network order (big
+          endian) and therefore need reordering on little endian systems.
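+    @note For example, on a little-endian system
+          write_number(std::uint32_t{0x12345678}) emits the bytes
+          0x12 0x34 0x56 0x78.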
+    */
+    template<typename NumberType, bool OutputIsLittleEndian = false>
+    void write_number(const NumberType n)
+    {
+        // step 1: write number to array of length NumberType
+        std::array<CharType, sizeof(NumberType)> vec{};
+        std::memcpy(vec.data(), &n, sizeof(NumberType));
+
+        // step 2: write array to output (with possible reordering)
+        if (is_little_endian != OutputIsLittleEndian)
+        {
+            // reverse byte order prior to writing if necessary
+            std::reverse(vec.begin(), vec.end());
+        }
+
+        oa->write_characters(vec.data(), sizeof(NumberType));
+    }
+
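+    /*!
+    @brief write a floating-point number using single precision when the
+           value survives the double -> float -> double round trip, and
+           double precision otherwise
+    @note For example, 0.5 is exactly representable as a float and is
+          written as a single-precision float, whereas 0.1 is not and is
+          written as a double-precision float.
+    */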
+    void write_compact_float(const number_float_t n, detail::input_format_t format)
+    {
+#ifdef __GNUC__
+#pragma GCC diagnostic push
+#pragma GCC diagnostic ignored "-Wfloat-equal"
+#endif
+        if (static_cast<double>(n) >= static_cast<double>(std::numeric_limits<float>::lowest()) &&
+                static_cast<double>(n) <= static_cast<double>((std::numeric_limits<float>::max)()) &&
+                static_cast<double>(static_cast<float>(n)) == static_cast<double>(n))
+        {
+            oa->write_character(format == detail::input_format_t::cbor
+                                ? get_cbor_float_prefix(static_cast<float>(n))
+                                : get_msgpack_float_prefix(static_cast<float>(n)));
+            write_number(static_cast<float>(n));
+        }
+        else
+        {
+            oa->write_character(format == detail::input_format_t::cbor
+                                ? get_cbor_float_prefix(n)
+                                : get_msgpack_float_prefix(n));
+            write_number(n);
+        }
+#ifdef __GNUC__
+#pragma GCC diagnostic pop
+#endif
+    }
+
+  public:
+    // The following to_char_type functions implement the conversion
+    // between uint8_t and CharType. In case CharType is not unsigned,
+    // such a conversion is required to allow values greater than 127.
+    // See <https://github.com/nlohmann/json/issues/1286> for a discussion.
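+    // For example, on a typical platform where char is signed,
+    // to_char_type(0xFF) produces a char holding the bit pattern 0xFF
+    // (value -1 in two's complement).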
+    template < typename C = CharType,
+               enable_if_t < std::is_signed<C>::value && std::is_signed<char>::value > * = nullptr >
+    static constexpr CharType to_char_type(std::uint8_t x) noexcept
+    {
+        return *reinterpret_cast<char*>(&x);
+    }
+
+    template < typename C = CharType,
+               enable_if_t < std::is_signed<C>::value && std::is_unsigned<char>::value > * = nullptr >
+    static CharType to_char_type(std::uint8_t x) noexcept
+    {
+        static_assert(sizeof(std::uint8_t) == sizeof(CharType), "size of CharType must be equal to std::uint8_t");
+        static_assert(std::is_trivial<CharType>::value, "CharType must be trivial");
+        CharType result;
+        std::memcpy(&result, &x, sizeof(x));
+        return result;
+    }
+
+    template<typename C = CharType,
+             enable_if_t<std::is_unsigned<C>::value>* = nullptr>
+    static constexpr CharType to_char_type(std::uint8_t x) noexcept
+    {
+        return x;
+    }
+
+    template < typename InputCharType, typename C = CharType,
+               enable_if_t <
+                   std::is_signed<C>::value &&
+                   std::is_signed<char>::value &&
+                   std::is_same<char, typename std::remove_cv<InputCharType>::type>::value
+                   > * = nullptr >
+    static constexpr CharType to_char_type(InputCharType x) noexcept
+    {
+        return x;
+    }
+
+  private:
+    /// whether we can assume little endianness
+    const bool is_little_endian = little_endianness();
+
+    /// the output
+    output_adapter_t<CharType> oa = nullptr;
+};
+}  // namespace detail
+}  // namespace nlohmann
+
+// #include <nlohmann/detail/output/output_adapters.hpp>
+
+// #include <nlohmann/detail/output/serializer.hpp>
+
+
+#include <algorithm> // reverse, remove, fill, find, none_of
+#include <array> // array
+#include <clocale> // localeconv, lconv
+#include <cmath> // labs, isfinite, isnan, signbit
+#include <cstddef> // size_t, ptrdiff_t
+#include <cstdint> // uint8_t
+#include <cstdio> // snprintf
+#include <limits> // numeric_limits
+#include <string> // string, char_traits
+#include <iomanip> // setfill, setw
+#include <sstream> // stringstream
+#include <type_traits> // is_same
+#include <utility> // move
+
+// #include <nlohmann/detail/conversions/to_chars.hpp>
+
+
+#include <array> // array
+#include <cmath>   // signbit, isfinite
+#include <cstdint> // intN_t, uintN_t
+#include <cstring> // memcpy, memmove
+#include <limits> // numeric_limits
+#include <type_traits> // conditional
+
+// #include <nlohmann/detail/macro_scope.hpp>
+
+
+namespace nlohmann
+{
+namespace detail
+{
+
+/*!
+@brief implements the Grisu2 algorithm for binary to decimal floating-point
+conversion.
+
+This implementation is a slightly modified version of the reference
+implementation which may be obtained from
+http://florian.loitsch.com/publications (bench.tar.gz).
+
+The code is distributed under the MIT license, Copyright (c) 2009 Florian Loitsch.
+
+For a detailed description of the algorithm see:
+
+[1] Loitsch, "Printing Floating-Point Numbers Quickly and Accurately with
+    Integers", Proceedings of the ACM SIGPLAN 2010 Conference on Programming
+    Language Design and Implementation, PLDI 2010
+[2] Burger, Dybvig, "Printing Floating-Point Numbers Quickly and Accurately",
+    Proceedings of the ACM SIGPLAN 1996 Conference on Programming Language
+    Design and Implementation, PLDI 1996
+*/
+namespace dtoa_impl
+{
+
+template<typename Target, typename Source>
+Target reinterpret_bits(const Source source)
+{
+    static_assert(sizeof(Target) == sizeof(Source), "size mismatch");
+
+    Target target;
+    std::memcpy(&target, &source, sizeof(Source));
+    return target;
+}
+
+struct diyfp // f * 2^e
+{
+    static constexpr int kPrecision = 64; // = q
+
+    std::uint64_t f = 0;
+    int e = 0;
+
+    constexpr diyfp(std::uint64_t f_, int e_) noexcept : f(f_), e(e_) {}
+
+    /*!
+    @brief returns x - y
+    @pre x.e == y.e and x.f >= y.f
+    */
+    static diyfp sub(const diyfp& x, const diyfp& y) noexcept
+    {
+        JSON_ASSERT(x.e == y.e);
+        JSON_ASSERT(x.f >= y.f);
+
+        return {x.f - y.f, x.e};
+    }
+
+    /*!
+    @brief returns x * y
+    @note The result is rounded. (Only the upper q bits are returned.)
+    */
+    static diyfp mul(const diyfp& x, const diyfp& y) noexcept
+    {
+        static_assert(kPrecision == 64, "internal error");
+
+        // Computes:
+        //  f = round((x.f * y.f) / 2^q)
+        //  e = x.e + y.e + q
+
+        // Emulate the 64-bit * 64-bit multiplication:
+        //
+        // p = u * v
+        //   = (u_lo + 2^32 u_hi) (v_lo + 2^32 v_hi)
+        //   = (u_lo v_lo         ) + 2^32 ((u_lo v_hi         ) + (u_hi v_lo         )) + 2^64 (u_hi v_hi         )
+        //   = (p0                ) + 2^32 ((p1                ) + (p2                )) + 2^64 (p3                )
+        //   = (p0_lo + 2^32 p0_hi) + 2^32 ((p1_lo + 2^32 p1_hi) + (p2_lo + 2^32 p2_hi)) + 2^64 (p3                )
+        //   = (p0_lo             ) + 2^32 (p0_hi + p1_lo + p2_lo                      ) + 2^64 (p1_hi + p2_hi + p3)
+        //   = (p0_lo             ) + 2^32 (Q                                          ) + 2^64 (H                 )
+        //   = (p0_lo             ) + 2^32 (Q_lo + 2^32 Q_hi                           ) + 2^64 (H                 )
+        //
+        // (Since Q might be larger than 2^32 - 1)
+        //
+        //   = (p0_lo + 2^32 Q_lo) + 2^64 (Q_hi + H)
+        //
+        // (Q_hi + H does not overflow a 64-bit int)
+        //
+        //   = p_lo + 2^64 p_hi
+
+        const std::uint64_t u_lo = x.f & 0xFFFFFFFFu;
+        const std::uint64_t u_hi = x.f >> 32u;
+        const std::uint64_t v_lo = y.f & 0xFFFFFFFFu;
+        const std::uint64_t v_hi = y.f >> 32u;
+
+        const std::uint64_t p0 = u_lo * v_lo;
+        const std::uint64_t p1 = u_lo * v_hi;
+        const std::uint64_t p2 = u_hi * v_lo;
+        const std::uint64_t p3 = u_hi * v_hi;
+
+        const std::uint64_t p0_hi = p0 >> 32u;
+        const std::uint64_t p1_lo = p1 & 0xFFFFFFFFu;
+        const std::uint64_t p1_hi = p1 >> 32u;
+        const std::uint64_t p2_lo = p2 & 0xFFFFFFFFu;
+        const std::uint64_t p2_hi = p2 >> 32u;
+
+        std::uint64_t Q = p0_hi + p1_lo + p2_lo;
+
+        // The full product might now be computed as
+        //
+        // p_hi = p3 + p2_hi + p1_hi + (Q >> 32)
+        // p_lo = p0_lo + (Q << 32)
+        //
+        // But in this particular case here, the full p_lo is not required.
+        // Effectively we only need to add the highest bit in p_lo to p_hi (and
+        // Q_hi + 1 does not overflow).
+
+        Q += std::uint64_t{1} << (64u - 32u - 1u); // round, ties up
+
+        const std::uint64_t h = p3 + p2_hi + p1_hi + (Q >> 32u);
+
+        return {h, x.e + y.e + 64};
+    }
+
+    /*!
+    @brief normalize x such that the significand is >= 2^(q-1)
+    @pre x.f != 0
+    */
+    static diyfp normalize(diyfp x) noexcept
+    {
+        JSON_ASSERT(x.f != 0);
+
+        while ((x.f >> 63u) == 0)
+        {
+            x.f <<= 1u;
+            x.e--;
+        }
+
+        return x;
+    }
+
+    /*!
+    @brief normalize x such that the result has the exponent target_exponent
+    @pre target_exponent <= x.e and the upper (x.e - target_exponent) bits of x.f must be zero.
+    */
+    static diyfp normalize_to(const diyfp& x, const int target_exponent) noexcept
+    {
+        const int delta = x.e - target_exponent;
+
+        JSON_ASSERT(delta >= 0);
+        JSON_ASSERT(((x.f << delta) >> delta) == x.f);
+
+        return {x.f << delta, target_exponent};
+    }
+};
+
+struct boundaries
+{
+    diyfp w;
+    diyfp minus;
+    diyfp plus;
+};
+
+/*!
+Compute the (normalized) diyfp representing the input number 'value' and its
+boundaries.
+
+@pre value must be finite and positive
+*/
+template<typename FloatType>
+boundaries compute_boundaries(FloatType value)
+{
+    JSON_ASSERT(std::isfinite(value));
+    JSON_ASSERT(value > 0);
+
+    // Convert the IEEE representation into a diyfp.
+    //
+    // If v is denormal:
+    //      value = 0.F * 2^(1 - bias) = (          F) * 2^(1 - bias - (p-1))
+    // If v is normalized:
+    //      value = 1.F * 2^(E - bias) = (2^(p-1) + F) * 2^(E - bias - (p-1))
+
+    static_assert(std::numeric_limits<FloatType>::is_iec559,
+                  "internal error: dtoa_short requires an IEEE-754 floating-point implementation");
+
+    constexpr int      kPrecision = std::numeric_limits<FloatType>::digits; // = p (includes the hidden bit)
+    constexpr int      kBias      = std::numeric_limits<FloatType>::max_exponent - 1 + (kPrecision - 1);
+    constexpr int      kMinExp    = 1 - kBias;
+    constexpr std::uint64_t kHiddenBit = std::uint64_t{1} << (kPrecision - 1); // = 2^(p-1)
+
+    using bits_type = typename std::conditional<kPrecision == 24, std::uint32_t, std::uint64_t >::type;
+
+    const auto bits = static_cast<std::uint64_t>(reinterpret_bits<bits_type>(value));
+    const std::uint64_t E = bits >> (kPrecision - 1);
+    const std::uint64_t F = bits & (kHiddenBit - 1);
+
+    const bool is_denormal = E == 0;
+    const diyfp v = is_denormal
+                    ? diyfp(F, kMinExp)
+                    : diyfp(F + kHiddenBit, static_cast<int>(E) - kBias);
+
+    // Compute the boundaries m- and m+ of the floating-point value
+    // v = f * 2^e.
+    //
+    // Determine v- and v+, the floating-point predecessor and successor of v,
+    // respectively.
+    //
+    //      v- = v - 2^e        if f != 2^(p-1) or e == e_min                (A)
+    //         = v - 2^(e-1)    if f == 2^(p-1) and e > e_min                (B)
+    //
+    //      v+ = v + 2^e
+    //
+    // Let m- = (v- + v) / 2 and m+ = (v + v+) / 2. All real numbers _strictly_
+    // between m- and m+ round to v, regardless of how the input rounding
+    // algorithm breaks ties.
+    //
+    //      ---+-------------+-------------+-------------+-------------+---  (A)
+    //         v-            m-            v             m+            v+
+    //
+    //      -----------------+------+------+-------------+-------------+---  (B)
+    //                       v-     m-     v             m+            v+
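+    //
+    // For example, for the double 1.0 we have F == 0 and E == 1023, so the
+    // lower boundary is closer and case (B) applies.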
+
+    const bool lower_boundary_is_closer = F == 0 && E > 1;
+    const diyfp m_plus = diyfp(2 * v.f + 1, v.e - 1);
+    const diyfp m_minus = lower_boundary_is_closer
+                          ? diyfp(4 * v.f - 1, v.e - 2)  // (B)
+                          : diyfp(2 * v.f - 1, v.e - 1); // (A)
+
+    // Determine the normalized w+ = m+.
+    const diyfp w_plus = diyfp::normalize(m_plus);
+
+    // Determine w- = m- such that e_(w-) = e_(w+).
+    const diyfp w_minus = diyfp::normalize_to(m_minus, w_plus.e);
+
+    return {diyfp::normalize(v), w_minus, w_plus};
+}
+
+// Given normalized diyfp w, Grisu needs to find a (normalized) cached
+// power-of-ten c, such that the exponent of the product c * w = f * 2^e lies
+// within a certain range [alpha, gamma] (Definition 3.2 from [1])
+//
+//      alpha <= e = e_c + e_w + q <= gamma
+//
+// or
+//
+//      f_c * f_w * 2^alpha <= f_c 2^(e_c) * f_w 2^(e_w) * 2^q
+//                          <= f_c * f_w * 2^gamma
+//
+// Since c and w are normalized, i.e. 2^(q-1) <= f < 2^q, this implies
+//
+//      2^(q-1) * 2^(q-1) * 2^alpha <= c * w * 2^q < 2^q * 2^q * 2^gamma
+//
+// or
+//
+//      2^(q - 2 + alpha) <= c * w < 2^(q + gamma)
+//
+// The choice of (alpha,gamma) determines the size of the table and the form of
+// the digit generation procedure. Using (alpha,gamma)=(-60,-32) works out well
+// in practice:
+//
+// The idea is to cut the number c * w = f * 2^e into two parts, which can be
+// processed independently: An integral part p1, and a fractional part p2:
+//
+//      f * 2^e = ( (f div 2^-e) * 2^-e + (f mod 2^-e) ) * 2^e
+//              = (f div 2^-e) + (f mod 2^-e) * 2^e
+//              = p1 + p2 * 2^e
+//
+// The conversion of p1 into decimal form requires a series of divisions and
+// modulos by (a power of) 10. These operations are faster for 32-bit than for
+// 64-bit integers, so p1 should ideally fit into a 32-bit integer. This can be
+// achieved by choosing
+//
+//      -e >= 32   or   e <= -32 := gamma
+//
+// In order to convert the fractional part
+//
+//      p2 * 2^e = p2 / 2^-e = d[-1] / 10^1 + d[-2] / 10^2 + ...
+//
+// into decimal form, the fraction is repeatedly multiplied by 10 and the digits
+// d[-i] are extracted in order:
+//
+//      (10 * p2) div 2^-e = d[-1]
+//      (10 * p2) mod 2^-e = d[-2] / 10^1 + ...
+//
+// The multiplication by 10 must not overflow. It is sufficient to choose
+//
+//      10 * p2 < 16 * p2 = 2^4 * p2 <= 2^64.
+//
+// Since p2 = f mod 2^-e < 2^-e,
+//
+//      -e <= 60   or   e >= -60 := alpha
+
+constexpr int kAlpha = -60;
+constexpr int kGamma = -32;
+
+struct cached_power // c = f * 2^e ~= 10^k
+{
+    std::uint64_t f;
+    int e;
+    int k;
+};
+
+/*!
+For a normalized diyfp w = f * 2^e, this function returns a (normalized) cached
+power-of-ten c = f_c * 2^e_c, such that the exponent of the product w * c
+satisfies (Definition 3.2 from [1])
+
+     alpha <= e_c + e + q <= gamma.
+*/
+inline cached_power get_cached_power_for_binary_exponent(int e)
+{
+    // Now
+    //
+    //      alpha <= e_c + e + q <= gamma                                    (1)
+    //      ==> f_c * 2^alpha <= c * 2^e * 2^q
+    //
+    // and since the c's are normalized, 2^(q-1) <= f_c,
+    //
+    //      ==> 2^(q - 1 + alpha) <= c * 2^(e + q)
+    //      ==> 2^(alpha - e - 1) <= c
+    //
+    // If c were an exact power of ten, i.e. c = 10^k, one may determine k as
+    //
+    //      k = ceil( log_10( 2^(alpha - e - 1) ) )
+    //        = ceil( (alpha - e - 1) * log_10(2) )
+    //
+    // From the paper:
+    // "In theory the result of the procedure could be wrong since c is rounded,
+    //  and the computation itself is approximated [...]. In practice, however,
+    //  this simple function is sufficient."
+    //
+    // For IEEE double precision floating-point numbers converted into
+    // normalized diyfp's w = f * 2^e, with q = 64,
+    //
+    //      e >= -1022      (min IEEE exponent)
+    //           -52        (p - 1)
+    //           -52        (p - 1, possibly normalize denormal IEEE numbers)
+    //           -11        (normalize the diyfp)
+    //         = -1137
+    //
+    // and
+    //
+    //      e <= +1023      (max IEEE exponent)
+    //           -52        (p - 1)
+    //           -11        (normalize the diyfp)
+    //         = 960
+    //
+    // This binary exponent range [-1137,960] results in a decimal exponent
+    // range [-307,324]. One does not need to store a cached power for each
+    // k in this range. For each such k it suffices to find a cached power
+    // such that the exponent of the product lies in [alpha,gamma].
+    // This implies that the difference of the decimal exponents of adjacent
+    // table entries must be less than or equal to
+    //
+    //      floor( (gamma - alpha) * log_10(2) ) = 8.
+    //
+    // (A smaller distance gamma-alpha would require a larger table.)
+
+    // NB:
+    // Actually this function returns c, such that -60 <= e_c + e + 64 <= -34.
+
+    constexpr int kCachedPowersMinDecExp = -300;
+    constexpr int kCachedPowersDecStep = 8;
+
+    static constexpr std::array<cached_power, 79> kCachedPowers =
+    {
+        {
+            { 0xAB70FE17C79AC6CA, -1060, -300 },
+            { 0xFF77B1FCBEBCDC4F, -1034, -292 },
+            { 0xBE5691EF416BD60C, -1007, -284 },
+            { 0x8DD01FAD907FFC3C,  -980, -276 },
+            { 0xD3515C2831559A83,  -954, -268 },
+            { 0x9D71AC8FADA6C9B5,  -927, -260 },
+            { 0xEA9C227723EE8BCB,  -901, -252 },
+            { 0xAECC49914078536D,  -874, -244 },
+            { 0x823C12795DB6CE57,  -847, -236 },
+            { 0xC21094364DFB5637,  -821, -228 },
+            { 0x9096EA6F3848984F,  -794, -220 },
+            { 0xD77485CB25823AC7,  -768, -212 },
+            { 0xA086CFCD97BF97F4,  -741, -204 },
+            { 0xEF340A98172AACE5,  -715, -196 },
+            { 0xB23867FB2A35B28E,  -688, -188 },
+            { 0x84C8D4DFD2C63F3B,  -661, -180 },
+            { 0xC5DD44271AD3CDBA,  -635, -172 },
+            { 0x936B9FCEBB25C996,  -608, -164 },
+            { 0xDBAC6C247D62A584,  -582, -156 },
+            { 0xA3AB66580D5FDAF6,  -555, -148 },
+            { 0xF3E2F893DEC3F126,  -529, -140 },
+            { 0xB5B5ADA8AAFF80B8,  -502, -132 },
+            { 0x87625F056C7C4A8B,  -475, -124 },
+            { 0xC9BCFF6034C13053,  -449, -116 },
+            { 0x964E858C91BA2655,  -422, -108 },
+            { 0xDFF9772470297EBD,  -396, -100 },
+            { 0xA6DFBD9FB8E5B88F,  -369,  -92 },
+            { 0xF8A95FCF88747D94,  -343,  -84 },
+            { 0xB94470938FA89BCF,  -316,  -76 },
+            { 0x8A08F0F8BF0F156B,  -289,  -68 },
+            { 0xCDB02555653131B6,  -263,  -60 },
+            { 0x993FE2C6D07B7FAC,  -236,  -52 },
+            { 0xE45C10C42A2B3B06,  -210,  -44 },
+            { 0xAA242499697392D3,  -183,  -36 },
+            { 0xFD87B5F28300CA0E,  -157,  -28 },
+            { 0xBCE5086492111AEB,  -130,  -20 },
+            { 0x8CBCCC096F5088CC,  -103,  -12 },
+            { 0xD1B71758E219652C,   -77,   -4 },
+            { 0x9C40000000000000,   -50,    4 },
+            { 0xE8D4A51000000000,   -24,   12 },
+            { 0xAD78EBC5AC620000,     3,   20 },
+            { 0x813F3978F8940984,    30,   28 },
+            { 0xC097CE7BC90715B3,    56,   36 },
+            { 0x8F7E32CE7BEA5C70,    83,   44 },
+            { 0xD5D238A4ABE98068,   109,   52 },
+            { 0x9F4F2726179A2245,   136,   60 },
+            { 0xED63A231D4C4FB27,   162,   68 },
+            { 0xB0DE65388CC8ADA8,   189,   76 },
+            { 0x83C7088E1AAB65DB,   216,   84 },
+            { 0xC45D1DF942711D9A,   242,   92 },
+            { 0x924D692CA61BE758,   269,  100 },
+            { 0xDA01EE641A708DEA,   295,  108 },
+            { 0xA26DA3999AEF774A,   322,  116 },
+            { 0xF209787BB47D6B85,   348,  124 },
+            { 0xB454E4A179DD1877,   375,  132 },
+            { 0x865B86925B9BC5C2,   402,  140 },
+            { 0xC83553C5C8965D3D,   428,  148 },
+            { 0x952AB45CFA97A0B3,   455,  156 },
+            { 0xDE469FBD99A05FE3,   481,  164 },
+            { 0xA59BC234DB398C25,   508,  172 },
+            { 0xF6C69A72A3989F5C,   534,  180 },
+            { 0xB7DCBF5354E9BECE,   561,  188 },
+            { 0x88FCF317F22241E2,   588,  196 },
+            { 0xCC20CE9BD35C78A5,   614,  204 },
+            { 0x98165AF37B2153DF,   641,  212 },
+            { 0xE2A0B5DC971F303A,   667,  220 },
+            { 0xA8D9D1535CE3B396,   694,  228 },
+            { 0xFB9B7CD9A4A7443C,   720,  236 },
+            { 0xBB764C4CA7A44410,   747,  244 },
+            { 0x8BAB8EEFB6409C1A,   774,  252 },
+            { 0xD01FEF10A657842C,   800,  260 },
+            { 0x9B10A4E5E9913129,   827,  268 },
+            { 0xE7109BFBA19C0C9D,   853,  276 },
+            { 0xAC2820D9623BF429,   880,  284 },
+            { 0x80444B5E7AA7CF85,   907,  292 },
+            { 0xBF21E44003ACDD2D,   933,  300 },
+            { 0x8E679C2F5E44FF8F,   960,  308 },
+            { 0xD433179D9C8CB841,   986,  316 },
+            { 0x9E19DB92B4E31BA9,  1013,  324 },
+        }
+    };
+
+    // This computation gives exactly the same results for k as
+    //      k = ceil((kAlpha - e - 1) * 0.30102999566398114)
+    // for |e| <= 1500, but doesn't require floating-point operations.
+    // NB: log_10(2) ~= 78913 / 2^18
+    JSON_ASSERT(e >= -1500);
+    JSON_ASSERT(e <=  1500);
+    const int f = kAlpha - e - 1;
+    const int k = (f * 78913) / (1 << 18) + static_cast<int>(f > 0);
+
+    const int index = (-kCachedPowersMinDecExp + k + (kCachedPowersDecStep - 1)) / kCachedPowersDecStep;
+    JSON_ASSERT(index >= 0);
+    JSON_ASSERT(static_cast<std::size_t>(index) < kCachedPowers.size());
+
+    const cached_power cached = kCachedPowers[static_cast<std::size_t>(index)];
+    JSON_ASSERT(kAlpha <= cached.e + e + 64);
+    JSON_ASSERT(kGamma >= cached.e + e + 64);
+
+    return cached;
+}
+
+/*!
+For n != 0, returns k, such that pow10 := 10^(k-1) <= n < 10^k.
+For n == 0, returns 1 and sets pow10 := 1.
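+
+For example, n = 4711 sets pow10 := 1000 and returns k = 4, since
+10^3 <= 4711 < 10^4.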
+*/
+inline int find_largest_pow10(const std::uint32_t n, std::uint32_t& pow10)
+{
+    // LCOV_EXCL_START
+    if (n >= 1000000000)
+    {
+        pow10 = 1000000000;
+        return 10;
+    }
+    // LCOV_EXCL_STOP
+    if (n >= 100000000)
+    {
+        pow10 = 100000000;
+        return  9;
+    }
+    if (n >= 10000000)
+    {
+        pow10 = 10000000;
+        return  8;
+    }
+    if (n >= 1000000)
+    {
+        pow10 = 1000000;
+        return  7;
+    }
+    if (n >= 100000)
+    {
+        pow10 = 100000;
+        return  6;
+    }
+    if (n >= 10000)
+    {
+        pow10 = 10000;
+        return  5;
+    }
+    if (n >= 1000)
+    {
+        pow10 = 1000;
+        return  4;
+    }
+    if (n >= 100)
+    {
+        pow10 = 100;
+        return  3;
+    }
+    if (n >= 10)
+    {
+        pow10 = 10;
+        return  2;
+    }
+
+    pow10 = 1;
+    return 1;
+}
+
+inline void grisu2_round(char* buf, int len, std::uint64_t dist, std::uint64_t delta,
+                         std::uint64_t rest, std::uint64_t ten_k)
+{
+    JSON_ASSERT(len >= 1);
+    JSON_ASSERT(dist <= delta);
+    JSON_ASSERT(rest <= delta);
+    JSON_ASSERT(ten_k > 0);
+
+    //               <--------------------------- delta ---->
+    //                                  <---- dist --------->
+    // --------------[------------------+-------------------]--------------
+    //               M-                 w                   M+
+    //
+    //                                  ten_k
+    //                                <------>
+    //                                       <---- rest ---->
+    // --------------[------------------+----+--------------]--------------
+    //                                  w    V
+    //                                       = buf * 10^k
+    //
+    // ten_k represents a unit-in-the-last-place in the decimal representation
+    // stored in buf.
+    // Decrement buf by ten_k while this takes buf closer to w.
+
+    // The tests are written in this order to avoid overflow in unsigned
+    // integer arithmetic.
+
+    while (rest < dist
+            && delta - rest >= ten_k
+            && (rest + ten_k < dist || dist - rest > rest + ten_k - dist))
+    {
+        JSON_ASSERT(buf[len - 1] != '0');
+        buf[len - 1]--;
+        rest += ten_k;
+    }
+}
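+// Illustrative example (numbers chosen for exposition, not from a real run):
+// with ten_k = 1, rest = 0, dist = 1 and delta = 2, V starts at M+ (rest = 0)
+// while w lies one unit below (dist = 1). The loop decrements the last digit
+// once, moving V exactly onto w, and then stops because rest == dist.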
+
+/*!
+Generates V = buffer * 10^decimal_exponent, such that M- <= V <= M+.
+M- and M+ must be normalized and share the same exponent -60 <= e <= -32.
+*/
+inline void grisu2_digit_gen(char* buffer, int& length, int& decimal_exponent,
+                             diyfp M_minus, diyfp w, diyfp M_plus)
+{
+    static_assert(kAlpha >= -60, "internal error");
+    static_assert(kGamma <= -32, "internal error");
+
+    // Generates the digits (and the exponent) of a decimal floating-point
+    // number V = buffer * 10^decimal_exponent in the range [M-, M+]. The diyfp's
+    // w, M- and M+ share the same exponent e, which satisfies alpha <= e <= gamma.
+    //
+    //               <--------------------------- delta ---->
+    //                                  <---- dist --------->
+    // --------------[------------------+-------------------]--------------
+    //               M-                 w                   M+
+    //
+    // Grisu2 generates the digits of M+ from left to right and stops as soon as
+    // V is in [M-,M+].
+
+    JSON_ASSERT(M_plus.e >= kAlpha);
+    JSON_ASSERT(M_plus.e <= kGamma);
+
+    std::uint64_t delta = diyfp::sub(M_plus, M_minus).f; // (significand of (M+ - M-), implicit exponent is e)
+    std::uint64_t dist  = diyfp::sub(M_plus, w      ).f; // (significand of (M+ - w ), implicit exponent is e)
+
+    // Split M+ = f * 2^e into two parts p1 and p2 (note: e < 0):
+    //
+    //      M+ = f * 2^e
+    //         = ((f div 2^-e) * 2^-e + (f mod 2^-e)) * 2^e
+    //         = ((p1        ) * 2^-e + (p2        )) * 2^e
+    //         = p1 + p2 * 2^e
+
+    const diyfp one(std::uint64_t{1} << -M_plus.e, M_plus.e);
+
+    auto p1 = static_cast<std::uint32_t>(M_plus.f >> -one.e); // p1 = f div 2^-e (Since -e >= 32, p1 fits into a 32-bit int.)
+    std::uint64_t p2 = M_plus.f & (one.f - 1);                    // p2 = f mod 2^-e
+
+    // 1)
+    //
+    // Generate the digits of the integral part p1 = d[n-1]...d[1]d[0]
+
+    JSON_ASSERT(p1 > 0);
+
+    std::uint32_t pow10{};
+    const int k = find_largest_pow10(p1, pow10);
+
+    //      10^(k-1) <= p1 < 10^k, pow10 = 10^(k-1)
+    //
+    //      p1 = (p1 div 10^(k-1)) * 10^(k-1) + (p1 mod 10^(k-1))
+    //         = (d[k-1]         ) * 10^(k-1) + (p1 mod 10^(k-1))
+    //
+    //      M+ = p1                                             + p2 * 2^e
+    //         = d[k-1] * 10^(k-1) + (p1 mod 10^(k-1))          + p2 * 2^e
+    //         = d[k-1] * 10^(k-1) + ((p1 mod 10^(k-1)) * 2^-e + p2) * 2^e
+    //         = d[k-1] * 10^(k-1) + (                         rest) * 2^e
+    //
+    // Now generate the digits d[n] of p1 from left to right (n = k-1,...,0)
+    //
+    //      p1 = d[k-1]...d[n] * 10^n + d[n-1]...d[0]
+    //
+    // but stop as soon as
+    //
+    //      rest * 2^e = (d[n-1]...d[0] * 2^-e + p2) * 2^e <= delta * 2^e
+
+    int n = k;
+    while (n > 0)
+    {
+        // Invariants:
+        //      M+ = buffer * 10^n + (p1 + p2 * 2^e)    (buffer = 0 for n = k)
+        //      pow10 = 10^(n-1) <= p1 < 10^n
+        //
+        const std::uint32_t d = p1 / pow10;  // d = p1 div 10^(n-1)
+        const std::uint32_t r = p1 % pow10;  // r = p1 mod 10^(n-1)
+        //
+        //      M+ = buffer * 10^n + (d * 10^(n-1) + r) + p2 * 2^e
+        //         = (buffer * 10 + d) * 10^(n-1) + (r + p2 * 2^e)
+        //
+        JSON_ASSERT(d <= 9);
+        buffer[length++] = static_cast<char>('0' + d); // buffer := buffer * 10 + d
+        //
+        //      M+ = buffer * 10^(n-1) + (r + p2 * 2^e)
+        //
+        p1 = r;
+        n--;
+        //
+        //      M+ = buffer * 10^n + (p1 + p2 * 2^e)
+        //      pow10 = 10^n
+        //
+
+        // Now check if enough digits have been generated.
+        // Compute
+        //
+        //      p1 + p2 * 2^e = (p1 * 2^-e + p2) * 2^e = rest * 2^e
+        //
+        // Note:
+        // Since rest and delta share the same exponent e, it suffices to
+        // compare the significands.
+        const std::uint64_t rest = (std::uint64_t{p1} << -one.e) + p2;
+        if (rest <= delta)
+        {
+            // V = buffer * 10^n, with M- <= V <= M+.
+
+            decimal_exponent += n;
+
+            // We may now just stop. But instead check whether the buffer
+            // could be decremented to bring V closer to w.
+            //
+            // pow10 = 10^n is now 1 ulp in the decimal representation V.
+            // The rounding procedure works with diyfp's with an implicit
+            // exponent of e.
+            //
+            //      10^n = (10^n * 2^-e) * 2^e = ulp * 2^e
+            //
+            const std::uint64_t ten_n = std::uint64_t{pow10} << -one.e;
+            grisu2_round(buffer, length, dist, delta, rest, ten_n);
+
+            return;
+        }
+
+        pow10 /= 10;
+        //
+        //      pow10 = 10^(n-1) <= p1 < 10^n
+        // Invariants restored.
+    }
+
+    // 2)
+    //
+    // The digits of the integral part have been generated:
+    //
+    //      M+ = d[k-1]...d[1]d[0] + p2 * 2^e
+    //         = buffer            + p2 * 2^e
+    //
+    // Now generate the digits of the fractional part p2 * 2^e.
+    //
+    // Note:
+    // No decimal point is generated: the exponent is adjusted instead.
+    //
+    // p2 actually represents the fraction
+    //
+    //      p2 * 2^e
+    //          = p2 / 2^-e
+    //          = d[-1] / 10^1 + d[-2] / 10^2 + ...
+    //
+    // Now generate the digits d[-m] of p2 from left to right (m = 1,2,...)
+    //
+    //      p2 * 2^e = d[-1]d[-2]...d[-m] * 10^-m
+    //                      + 10^-m * (d[-m-1] / 10^1 + d[-m-2] / 10^2 + ...)
+    //
+    // using
+    //
+    //      10^m * p2 = ((10^m * p2) div 2^-e) * 2^-e + ((10^m * p2) mod 2^-e)
+    //                = (                   d) * 2^-e + (                   r)
+    //
+    // or
+    //      10^m * p2 * 2^e = d + r * 2^e
+    //
+    // i.e.
+    //
+    //      M+ = buffer + p2 * 2^e
+    //         = buffer + 10^-m * (d + r * 2^e)
+    //         = (buffer * 10^m + d) * 10^-m + 10^-m * r * 2^e
+    //
+    // and stop as soon as 10^-m * r * 2^e <= delta * 2^e
+
+    JSON_ASSERT(p2 > delta);
+
+    int m = 0;
+    for (;;)
+    {
+        // Invariant:
+        //      M+ = buffer * 10^-m + 10^-m * (d[-m-1] / 10 + d[-m-2] / 10^2 + ...) * 2^e
+        //         = buffer * 10^-m + 10^-m * (p2                                 ) * 2^e
+        //         = buffer * 10^-m + 10^-m * (1/10 * (10 * p2)                   ) * 2^e
+        //         = buffer * 10^-m + 10^-m * (1/10 * ((10*p2 div 2^-e) * 2^-e + (10*p2 mod 2^-e))) * 2^e
+        //
+        JSON_ASSERT(p2 <= (std::numeric_limits<std::uint64_t>::max)() / 10);
+        p2 *= 10;
+        const std::uint64_t d = p2 >> -one.e;     // d = (10 * p2) div 2^-e
+        const std::uint64_t r = p2 & (one.f - 1); // r = (10 * p2) mod 2^-e
+        //
+        //      M+ = buffer * 10^-m + 10^-m * (1/10 * (d * 2^-e + r)) * 2^e
+        //         = buffer * 10^-m + 10^-m * (1/10 * (d + r * 2^e))
+        //         = (buffer * 10 + d) * 10^(-m-1) + 10^(-m-1) * r * 2^e
+        //
+        JSON_ASSERT(d <= 9);
+        buffer[length++] = static_cast<char>('0' + d); // buffer := buffer * 10 + d
+        //
+        //      M+ = buffer * 10^(-m-1) + 10^(-m-1) * r * 2^e
+        //
+        p2 = r;
+        m++;
+        //
+        //      M+ = buffer * 10^-m + 10^-m * p2 * 2^e
+        // Invariant restored.
+
+        // Check if enough digits have been generated.
+        //
+        //      10^-m * p2 * 2^e <= delta * 2^e
+        //              p2 * 2^e <= 10^m * delta * 2^e
+        //                    p2 <= 10^m * delta
+        delta *= 10;
+        dist  *= 10;
+        if (p2 <= delta)
+        {
+            break;
+        }
+    }
+
+    // V = buffer * 10^-m, with M- <= V <= M+.
+
+    decimal_exponent -= m;
+
+    // 1 ulp in the decimal representation is now 10^-m.
+    // Since delta and dist are now scaled by 10^m, we need to do the
+    // same with ulp in order to keep the units in sync.
+    //
+    //      10^m * 10^-m = 1 = 2^-e * 2^e = ten_m * 2^e
+    //
+    const std::uint64_t ten_m = one.f;
+    grisu2_round(buffer, length, dist, delta, p2, ten_m);
+
+    // By construction this algorithm generates the shortest possible decimal
+    // number (Loitsch, Theorem 6.2) which rounds back to w.
+    // For an input number of precision p, at least
+    //
+    //      N = 1 + ceil(p * log_10(2))
+    //
+    // decimal digits are sufficient to identify all binary floating-point
+    // numbers (Matula, "In-and-Out conversions").
+    // This implies that the algorithm does not produce more than N decimal
+    // digits.
+    //
+    //      N = 17 for p = 53 (IEEE double precision)
+    //      N = 9  for p = 24 (IEEE single precision)
+}
+
+/*!
+v = buf * 10^decimal_exponent
+len is the length of the buffer (number of decimal digits)
+The buffer must be large enough, i.e. >= max_digits10.
+*/
+JSON_HEDLEY_NON_NULL(1)
+inline void grisu2(char* buf, int& len, int& decimal_exponent,
+                   diyfp m_minus, diyfp v, diyfp m_plus)
+{
+    JSON_ASSERT(m_plus.e == m_minus.e);
+    JSON_ASSERT(m_plus.e == v.e);
+
+    //  --------(-----------------------+-----------------------)--------    (A)
+    //          m-                      v                       m+
+    //
+    //  --------------------(-----------+-----------------------)--------    (B)
+    //                      m-          v                       m+
+    //
+    // First scale v (and m- and m+) such that the exponent is in the range
+    // [alpha, gamma].
+
+    const cached_power cached = get_cached_power_for_binary_exponent(m_plus.e);
+
+    const diyfp c_minus_k(cached.f, cached.e); // = c ~= 10^-k
+
+    // The exponent of the products is v.e + c_minus_k.e + q and lies in the range [alpha, gamma].
+    const diyfp w       = diyfp::mul(v,       c_minus_k);
+    const diyfp w_minus = diyfp::mul(m_minus, c_minus_k);
+    const diyfp w_plus  = diyfp::mul(m_plus,  c_minus_k);
+
+    //  ----(---+---)---------------(---+---)---------------(---+---)----
+    //          w-                      w                       w+
+    //          = c*m-                  = c*v                   = c*m+
+    //
+    // diyfp::mul rounds its result and c_minus_k is approximated too. w, w- and
+    // w+ are now off by a small amount.
+    // In fact:
+    //
+    //      w - v * 10^k < 1 ulp
+    //
+    // To account for this inaccuracy, add or subtract 1 ulp, respectively.
+    //
+    //  --------+---[---------------(---+---)---------------]---+--------
+    //          w-  M-                  w                   M+  w+
+    //
+    // Now any number in [M-, M+] (bounds included) will round to w when input,
+    // regardless of how the input rounding algorithm breaks ties.
+    //
+    // And digit_gen generates the shortest possible such number in [M-, M+].
+    // Note that this does not mean that Grisu2 always generates the shortest
+    // possible number in the interval (m-, m+).
+    const diyfp M_minus(w_minus.f + 1, w_minus.e);
+    const diyfp M_plus (w_plus.f  - 1, w_plus.e );
+
+    decimal_exponent = -cached.k; // = -(-k) = k
+
+    grisu2_digit_gen(buf, len, decimal_exponent, M_minus, w, M_plus);
+}
+
+/*!
+v = buf * 10^decimal_exponent
+len is the length of the buffer (number of decimal digits)
+The buffer must be large enough, i.e. >= max_digits10.
+*/
+template<typename FloatType>
+JSON_HEDLEY_NON_NULL(1)
+void grisu2(char* buf, int& len, int& decimal_exponent, FloatType value)
+{
+    static_assert(diyfp::kPrecision >= std::numeric_limits<FloatType>::digits + 3,
+                  "internal error: not enough precision");
+
+    JSON_ASSERT(std::isfinite(value));
+    JSON_ASSERT(value > 0);
+
+    // If the neighbors (and boundaries) of 'value' are always computed for double-precision
+    // numbers, all floats can be recovered using strtod (and strtof). However, the resulting
+    // decimal representations are not exactly "short".
+    //
+    // The documentation for 'std::to_chars' (https://en.cppreference.com/w/cpp/utility/to_chars)
+    // says "value is converted to a string as if by std::sprintf in the default ("C") locale"
+    // and since sprintf promotes floats to doubles, I think this is exactly what 'std::to_chars'
+    // does.
+    // On the other hand, the documentation for 'std::to_chars' requires that "parsing the
+    // representation using the corresponding std::from_chars function recovers value exactly". That
+    // indicates that single precision floating-point numbers should be recovered using
+    // 'std::strtof'.
+    //
+    // NB: If the neighbors are computed for single-precision numbers, there is a single float
+    //     (7.0385307e-26f) which can't be recovered using strtod. The resulting double precision
+    //     value is off by 1 ulp.
+#if 0
+    const boundaries w = compute_boundaries(static_cast<double>(value));
+#else
+    const boundaries w = compute_boundaries(value);
+#endif
+
+    grisu2(buf, len, decimal_exponent, w.minus, w.w, w.plus);
+}
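+// Usage sketch (illustrative only; this is an internal helper, not a public
+// interface):
+//
+//      char buf[32];
+//      int len = 0;
+//      int decimal_exponent = 0;
+//      grisu2(buf, len, decimal_exponent, 0.3);
+//      // typically yields buf = "3", len = 1, decimal_exponent = -1,
+//      // i.e. 0.3 = 3 * 10^-1 -- the shortest representation that
+//      // round-trips through strtod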
+
+/*!
+@brief appends a decimal representation of e to buf
+@return a pointer to the element following the exponent.
+@pre -1000 < e < 1000
+*/
+JSON_HEDLEY_NON_NULL(1)
+JSON_HEDLEY_RETURNS_NON_NULL
+inline char* append_exponent(char* buf, int e)
+{
+    JSON_ASSERT(e > -1000);
+    JSON_ASSERT(e <  1000);
+
+    if (e < 0)
+    {
+        e = -e;
+        *buf++ = '-';
+    }
+    else
+    {
+        *buf++ = '+';
+    }
+
+    auto k = static_cast<std::uint32_t>(e);
+    if (k < 10)
+    {
+        // Always print at least two digits in the exponent.
+        // This is for compatibility with printf("%g").
+        *buf++ = '0';
+        *buf++ = static_cast<char>('0' + k);
+    }
+    else if (k < 100)
+    {
+        *buf++ = static_cast<char>('0' + k / 10);
+        k %= 10;
+        *buf++ = static_cast<char>('0' + k);
+    }
+    else
+    {
+        *buf++ = static_cast<char>('0' + k / 100);
+        k %= 100;
+        *buf++ = static_cast<char>('0' + k / 10);
+        k %= 10;
+        *buf++ = static_cast<char>('0' + k);
+    }
+
+    return buf;
+}
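+// Examples: append_exponent(buf, 5) writes "+05" (at least two digits, as with
+// printf("%g")); append_exponent(buf, -123) writes "-123".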
+
+/*!
+@brief prettify v = buf * 10^decimal_exponent
+
+If v is in the range [10^min_exp, 10^max_exp) it will be printed in fixed-point
+notation. Otherwise it will be printed in exponential notation.
+
+@pre min_exp < 0
+@pre max_exp > 0
+*/
+JSON_HEDLEY_NON_NULL(1)
+JSON_HEDLEY_RETURNS_NON_NULL
+inline char* format_buffer(char* buf, int len, int decimal_exponent,
+                           int min_exp, int max_exp)
+{
+    JSON_ASSERT(min_exp < 0);
+    JSON_ASSERT(max_exp > 0);
+
+    const int k = len;
+    const int n = len + decimal_exponent;
+
+    // v = buf * 10^(n-k)
+    // k is the length of the buffer (number of decimal digits)
+    // n is the position of the decimal point relative to the start of the buffer.
+
+    if (k <= n && n <= max_exp)
+    {
+        // digits[000]
+        // len <= max_exp + 2
+
+        std::memset(buf + k, '0', static_cast<size_t>(n) - static_cast<size_t>(k));
+        // Make it look like a floating-point number (#362, #378)
+        buf[n + 0] = '.';
+        buf[n + 1] = '0';
+        return buf + (static_cast<size_t>(n) + 2);
+    }
+
+    if (0 < n && n <= max_exp)
+    {
+        // dig.its
+        // len <= max_digits10 + 1
+
+        JSON_ASSERT(k > n);
+
+        std::memmove(buf + (static_cast<size_t>(n) + 1), buf + n, static_cast<size_t>(k) - static_cast<size_t>(n));
+        buf[n] = '.';
+        return buf + (static_cast<size_t>(k) + 1U);
+    }
+
+    if (min_exp < n && n <= 0)
+    {
+        // 0.[000]digits
+        // len <= 2 + (-min_exp - 1) + max_digits10
+
+        std::memmove(buf + (2 + static_cast<size_t>(-n)), buf, static_cast<size_t>(k));
+        buf[0] = '0';
+        buf[1] = '.';
+        std::memset(buf + 2, '0', static_cast<size_t>(-n));
+        return buf + (2U + static_cast<size_t>(-n) + static_cast<size_t>(k));
+    }
+
+    if (k == 1)
+    {
+        // dE+123
+        // len <= 1 + 5
+
+        buf += 1;
+    }
+    else
+    {
+        // d.igitsE+123
+        // len <= max_digits10 + 1 + 5
+
+        std::memmove(buf + 2, buf + 1, static_cast<size_t>(k) - 1);
+        buf[1] = '.';
+        buf += 1 + static_cast<size_t>(k);
+    }
+
+    *buf++ = 'e';
+    return append_exponent(buf, n - 1);
+}
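+// Examples for the branches above (v = buf * 10^decimal_exponent; assuming
+// min_exp = -4 and max_exp = 15, the values to_chars uses for double):
+//
+//      buf = "42",   decimal_exponent =   1  ->  "420.0"      digits[000]
+//      buf = "1234", decimal_exponent =  -2  ->  "12.34"      dig.its
+//      buf = "1",    decimal_exponent =  -3  ->  "0.001"      0.[000]digits
+//      buf = "1",    decimal_exponent = -10  ->  "1e-10"      dE+123
+//      buf = "1234", decimal_exponent =  20  ->  "1.234e+23"  d.igitsE+123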
+
+} // namespace dtoa_impl
+
+/*!
+@brief generates a decimal representation of the floating-point number value in [first, last).
+
+The format of the resulting decimal representation is similar to printf's %g
+format. Returns an iterator pointing past-the-end of the decimal representation.
+
+@note The input number must be finite, i.e. NaN's and Inf's are not supported.
+@note The buffer must be large enough.
+@note The result is NOT null-terminated.
+*/
+template<typename FloatType>
+JSON_HEDLEY_NON_NULL(1, 2)
+JSON_HEDLEY_RETURNS_NON_NULL
+char* to_chars(char* first, const char* last, FloatType value)
+{
+    static_cast<void>(last); // maybe unused - suppress "unused parameter" warning
+    JSON_ASSERT(std::isfinite(value));
+
+    // Use signbit(value) instead of (value < 0) since signbit works for -0.
+    if (std::signbit(value))
+    {
+        value = -value;
+        *first++ = '-';
+    }
+
+#ifdef __GNUC__
+#pragma GCC diagnostic push
+#pragma GCC diagnostic ignored "-Wfloat-equal"
+#endif
+    if (value == 0) // +-0
+    {
+        *first++ = '0';
+        // Make it look like a floating-point number (#362, #378)
+        *first++ = '.';
+        *first++ = '0';
+        return first;
+    }
+#ifdef __GNUC__
+#pragma GCC diagnostic pop
+#endif
+
+    JSON_ASSERT(last - first >= std::numeric_limits<FloatType>::max_digits10);
+
+    // Compute v = buffer * 10^decimal_exponent.
+    // The decimal digits are stored in the buffer, which needs to be interpreted
+    // as an unsigned decimal integer.
+    // len is the length of the buffer, i.e. the number of decimal digits.
+    int len = 0;
+    int decimal_exponent = 0;
+    dtoa_impl::grisu2(first, len, decimal_exponent, value);
+
+    JSON_ASSERT(len <= std::numeric_limits<FloatType>::max_digits10);
+
+    // Format the buffer like printf("%.*g", prec, value)
+    constexpr int kMinExp = -4;
+    // Use digits10 here to increase compatibility with version 2.
+    constexpr int kMaxExp = std::numeric_limits<FloatType>::digits10;
+
+    JSON_ASSERT(last - first >= kMaxExp + 2);
+    JSON_ASSERT(last - first >= 2 + (-kMinExp - 1) + std::numeric_limits<FloatType>::max_digits10);
+    JSON_ASSERT(last - first >= std::numeric_limits<FloatType>::max_digits10 + 6);
+
+    return dtoa_impl::format_buffer(first, len, decimal_exponent, kMinExp, kMaxExp);
+}
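+// Usage sketch (illustrative):
+//
+//      std::array<char, 64> buf{};
+//      char* end = to_chars(buf.data(), buf.data() + buf.size(), 3.14);
+//      std::string s(buf.data(), end); // "3.14"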
+
+} // namespace detail
+} // namespace nlohmann
+
+// #include <nlohmann/detail/exceptions.hpp>
+
+// #include <nlohmann/detail/macro_scope.hpp>
+
+// #include <nlohmann/detail/meta/cpp_future.hpp>
+
+// #include <nlohmann/detail/output/binary_writer.hpp>
+
+// #include <nlohmann/detail/output/output_adapters.hpp>
+
+// #include <nlohmann/detail/value_t.hpp>
+
+
+namespace nlohmann
+{
+namespace detail
+{
+///////////////////
+// serialization //
+///////////////////
+
+/// how to treat decoding errors
+enum class error_handler_t
+{
+    strict,  ///< throw a type_error exception in case of invalid UTF-8
+    replace, ///< replace invalid UTF-8 sequences with U+FFFD
+    ignore   ///< ignore invalid UTF-8 sequences
+};
+
+template<typename BasicJsonType>
+class serializer
+{
+    using string_t = typename BasicJsonType::string_t;
+    using number_float_t = typename BasicJsonType::number_float_t;
+    using number_integer_t = typename BasicJsonType::number_integer_t;
+    using number_unsigned_t = typename BasicJsonType::number_unsigned_t;
+    using binary_char_t = typename BasicJsonType::binary_t::value_type;
+    static constexpr std::uint8_t UTF8_ACCEPT = 0;
+    static constexpr std::uint8_t UTF8_REJECT = 1;
+
+  public:
+    /*!
+    @param[in] s  output stream to serialize to
+    @param[in] ichar  indentation character to use
+    @param[in] error_handler_  how to react to decoding errors
+    */
+    serializer(output_adapter_t<char> s, const char ichar,
+               error_handler_t error_handler_ = error_handler_t::strict)
+        : o(std::move(s))
+        , loc(std::localeconv())
+        , thousands_sep(loc->thousands_sep == nullptr ? '\0' : std::char_traits<char>::to_char_type(* (loc->thousands_sep)))
+        , decimal_point(loc->decimal_point == nullptr ? '\0' : std::char_traits<char>::to_char_type(* (loc->decimal_point)))
+        , indent_char(ichar)
+        , indent_string(512, indent_char)
+        , error_handler(error_handler_)
+    {}
+
+    // delete because of pointer members
+    serializer(const serializer&) = delete;
+    serializer& operator=(const serializer&) = delete;
+    serializer(serializer&&) = delete;
+    serializer& operator=(serializer&&) = delete;
+    ~serializer() = default;
+
+    /*!
+    @brief internal implementation of the serialization function
+
+    This function is called by the public member function dump and organizes
+    the serialization internally. The indentation level is propagated as
+    additional parameter. In case of arrays and objects, the function is
+    called recursively.
+
+    - strings and object keys are escaped using `escape_string()`
+    - integer numbers are converted implicitly via `operator<<`
+    - floating-point numbers are converted to a string using `"%g"` format
+    - binary values are serialized as objects containing the subtype and the
+      byte array
+
+    @param[in] val               value to serialize
+    @param[in] pretty_print      whether the output shall be pretty-printed
+    @param[in] ensure_ascii If @a ensure_ascii is true, all non-ASCII characters
+    in the output are escaped with `\uXXXX` sequences, and the result consists
+    of ASCII characters only.
+    @param[in] indent_step       the indent level
+    @param[in] current_indent    the current indent level (only used internally)
+    */
+    void dump(const BasicJsonType& val,
+              const bool pretty_print,
+              const bool ensure_ascii,
+              const unsigned int indent_step,
+              const unsigned int current_indent = 0)
+    {
+        switch (val.m_type)
+        {
+            case value_t::object:
+            {
+                if (val.m_value.object->empty())
+                {
+                    o->write_characters("{}", 2);
+                    return;
+                }
+
+                if (pretty_print)
+                {
+                    o->write_characters("{\n", 2);
+
+                    // variable to hold indentation for recursive calls
+                    const auto new_indent = current_indent + indent_step;
+                    if (JSON_HEDLEY_UNLIKELY(indent_string.size() < new_indent))
+                    {
+                        indent_string.resize(indent_string.size() * 2, ' ');
+                    }
+
+                    // first n-1 elements
+                    auto i = val.m_value.object->cbegin();
+                    for (std::size_t cnt = 0; cnt < val.m_value.object->size() - 1; ++cnt, ++i)
+                    {
+                        o->write_characters(indent_string.c_str(), new_indent);
+                        o->write_character('\"');
+                        dump_escaped(i->first, ensure_ascii);
+                        o->write_characters("\": ", 3);
+                        dump(i->second, true, ensure_ascii, indent_step, new_indent);
+                        o->write_characters(",\n", 2);
+                    }
+
+                    // last element
+                    JSON_ASSERT(i != val.m_value.object->cend());
+                    JSON_ASSERT(std::next(i) == val.m_value.object->cend());
+                    o->write_characters(indent_string.c_str(), new_indent);
+                    o->write_character('\"');
+                    dump_escaped(i->first, ensure_ascii);
+                    o->write_characters("\": ", 3);
+                    dump(i->second, true, ensure_ascii, indent_step, new_indent);
+
+                    o->write_character('\n');
+                    o->write_characters(indent_string.c_str(), current_indent);
+                    o->write_character('}');
+                }
+                else
+                {
+                    o->write_character('{');
+
+                    // first n-1 elements
+                    auto i = val.m_value.object->cbegin();
+                    for (std::size_t cnt = 0; cnt < val.m_value.object->size() - 1; ++cnt, ++i)
+                    {
+                        o->write_character('\"');
+                        dump_escaped(i->first, ensure_ascii);
+                        o->write_characters("\":", 2);
+                        dump(i->second, false, ensure_ascii, indent_step, current_indent);
+                        o->write_character(',');
+                    }
+
+                    // last element
+                    JSON_ASSERT(i != val.m_value.object->cend());
+                    JSON_ASSERT(std::next(i) == val.m_value.object->cend());
+                    o->write_character('\"');
+                    dump_escaped(i->first, ensure_ascii);
+                    o->write_characters("\":", 2);
+                    dump(i->second, false, ensure_ascii, indent_step, current_indent);
+
+                    o->write_character('}');
+                }
+
+                return;
+            }
+
+            case value_t::array:
+            {
+                if (val.m_value.array->empty())
+                {
+                    o->write_characters("[]", 2);
+                    return;
+                }
+
+                if (pretty_print)
+                {
+                    o->write_characters("[\n", 2);
+
+                    // variable to hold indentation for recursive calls
+                    const auto new_indent = current_indent + indent_step;
+                    if (JSON_HEDLEY_UNLIKELY(indent_string.size() < new_indent))
+                    {
+                        indent_string.resize(indent_string.size() * 2, ' ');
+                    }
+
+                    // first n-1 elements
+                    for (auto i = val.m_value.array->cbegin();
+                            i != val.m_value.array->cend() - 1; ++i)
+                    {
+                        o->write_characters(indent_string.c_str(), new_indent);
+                        dump(*i, true, ensure_ascii, indent_step, new_indent);
+                        o->write_characters(",\n", 2);
+                    }
+
+                    // last element
+                    JSON_ASSERT(!val.m_value.array->empty());
+                    o->write_characters(indent_string.c_str(), new_indent);
+                    dump(val.m_value.array->back(), true, ensure_ascii, indent_step, new_indent);
+
+                    o->write_character('\n');
+                    o->write_characters(indent_string.c_str(), current_indent);
+                    o->write_character(']');
+                }
+                else
+                {
+                    o->write_character('[');
+
+                    // first n-1 elements
+                    for (auto i = val.m_value.array->cbegin();
+                            i != val.m_value.array->cend() - 1; ++i)
+                    {
+                        dump(*i, false, ensure_ascii, indent_step, current_indent);
+                        o->write_character(',');
+                    }
+
+                    // last element
+                    JSON_ASSERT(!val.m_value.array->empty());
+                    dump(val.m_value.array->back(), false, ensure_ascii, indent_step, current_indent);
+
+                    o->write_character(']');
+                }
+
+                return;
+            }
+
+            case value_t::string:
+            {
+                o->write_character('\"');
+                dump_escaped(*val.m_value.string, ensure_ascii);
+                o->write_character('\"');
+                return;
+            }
+
+            case value_t::binary:
+            {
+                if (pretty_print)
+                {
+                    o->write_characters("{\n", 2);
+
+                    // variable to hold indentation for recursive calls
+                    const auto new_indent = current_indent + indent_step;
+                    if (JSON_HEDLEY_UNLIKELY(indent_string.size() < new_indent))
+                    {
+                        indent_string.resize(indent_string.size() * 2, ' ');
+                    }
+
+                    o->write_characters(indent_string.c_str(), new_indent);
+
+                    o->write_characters("\"bytes\": [", 10);
+
+                    if (!val.m_value.binary->empty())
+                    {
+                        for (auto i = val.m_value.binary->cbegin();
+                                i != val.m_value.binary->cend() - 1; ++i)
+                        {
+                            dump_integer(*i);
+                            o->write_characters(", ", 2);
+                        }
+                        dump_integer(val.m_value.binary->back());
+                    }
+
+                    o->write_characters("],\n", 3);
+                    o->write_characters(indent_string.c_str(), new_indent);
+
+                    o->write_characters("\"subtype\": ", 11);
+                    if (val.m_value.binary->has_subtype())
+                    {
+                        dump_integer(val.m_value.binary->subtype());
+                    }
+                    else
+                    {
+                        o->write_characters("null", 4);
+                    }
+                    o->write_character('\n');
+                    o->write_characters(indent_string.c_str(), current_indent);
+                    o->write_character('}');
+                }
+                else
+                {
+                    o->write_characters("{\"bytes\":[", 10);
+
+                    if (!val.m_value.binary->empty())
+                    {
+                        for (auto i = val.m_value.binary->cbegin();
+                                i != val.m_value.binary->cend() - 1; ++i)
+                        {
+                            dump_integer(*i);
+                            o->write_character(',');
+                        }
+                        dump_integer(val.m_value.binary->back());
+                    }
+
+                    o->write_characters("],\"subtype\":", 12);
+                    if (val.m_value.binary->has_subtype())
+                    {
+                        dump_integer(val.m_value.binary->subtype());
+                        o->write_character('}');
+                    }
+                    else
+                    {
+                        o->write_characters("null}", 5);
+                    }
+                }
+                return;
+            }
+
+            case value_t::boolean:
+            {
+                if (val.m_value.boolean)
+                {
+                    o->write_characters("true", 4);
+                }
+                else
+                {
+                    o->write_characters("false", 5);
+                }
+                return;
+            }
+
+            case value_t::number_integer:
+            {
+                dump_integer(val.m_value.number_integer);
+                return;
+            }
+
+            case value_t::number_unsigned:
+            {
+                dump_integer(val.m_value.number_unsigned);
+                return;
+            }
+
+            case value_t::number_float:
+            {
+                dump_float(val.m_value.number_float);
+                return;
+            }
+
+            case value_t::discarded:
+            {
+                o->write_characters("<discarded>", 11);
+                return;
+            }
+
+            case value_t::null:
+            {
+                o->write_characters("null", 4);
+                return;
+            }
+
+            default:            // LCOV_EXCL_LINE
+                JSON_ASSERT(false); // NOLINT(cert-dcl03-c,hicpp-static-assert,misc-static-assert) LCOV_EXCL_LINE
+        }
+    }
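+    // For illustration, dumping the value {"a": [1, 2]} with pretty_print = true
+    // and indent_step = 2 produces
+    //
+    //      {
+    //        "a": [
+    //          1,
+    //          2
+    //        ]
+    //      }
+    //
+    // while pretty_print = false produces {"a":[1,2]}.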
+
+  JSON_PRIVATE_UNLESS_TESTED:
+    /*!
+    @brief dump escaped string
+
+    Escape a string by replacing certain special characters with a two-character
+    escape sequence (a backslash followed by another character) and all other
+    control characters with "\u" followed by a four-digit hex representation.
+    The escaped string is written to output stream @a o.
+
+    @param[in] s  the string to escape
+    @param[in] ensure_ascii  whether to escape non-ASCII characters with
+                             \uXXXX sequences
+
+    @complexity Linear in the length of string @a s.
+    */
+    void dump_escaped(const string_t& s, const bool ensure_ascii)
+    {
+        std::uint32_t codepoint{};
+        std::uint8_t state = UTF8_ACCEPT;
+        std::size_t bytes = 0;  // number of bytes written to string_buffer
+
+        // number of bytes written at the point of the last valid byte
+        std::size_t bytes_after_last_accept = 0;
+        std::size_t undumped_chars = 0;
+
+        for (std::size_t i = 0; i < s.size(); ++i)
+        {
+            const auto byte = static_cast<std::uint8_t>(s[i]);
+
+            switch (decode(state, codepoint, byte))
+            {
+                case UTF8_ACCEPT:  // decode found a new code point
+                {
+                    switch (codepoint)
+                    {
+                        case 0x08: // backspace
+                        {
+                            string_buffer[bytes++] = '\\';
+                            string_buffer[bytes++] = 'b';
+                            break;
+                        }
+
+                        case 0x09: // horizontal tab
+                        {
+                            string_buffer[bytes++] = '\\';
+                            string_buffer[bytes++] = 't';
+                            break;
+                        }
+
+                        case 0x0A: // newline
+                        {
+                            string_buffer[bytes++] = '\\';
+                            string_buffer[bytes++] = 'n';
+                            break;
+                        }
+
+                        case 0x0C: // formfeed
+                        {
+                            string_buffer[bytes++] = '\\';
+                            string_buffer[bytes++] = 'f';
+                            break;
+                        }
+
+                        case 0x0D: // carriage return
+                        {
+                            string_buffer[bytes++] = '\\';
+                            string_buffer[bytes++] = 'r';
+                            break;
+                        }
+
+                        case 0x22: // quotation mark
+                        {
+                            string_buffer[bytes++] = '\\';
+                            string_buffer[bytes++] = '\"';
+                            break;
+                        }
+
+                        case 0x5C: // reverse solidus
+                        {
+                            string_buffer[bytes++] = '\\';
+                            string_buffer[bytes++] = '\\';
+                            break;
+                        }
+
+                        default:
+                        {
+                            // escape control characters (0x00..0x1F) or, if
+                            // ensure_ascii parameter is used, non-ASCII characters
+                            if ((codepoint <= 0x1F) || (ensure_ascii && (codepoint >= 0x7F)))
+                            {
+                                if (codepoint <= 0xFFFF)
+                                {
+                                    // NOLINTNEXTLINE(cppcoreguidelines-pro-type-vararg,hicpp-vararg)
+                                    static_cast<void>((std::snprintf)(string_buffer.data() + bytes, 7, "\\u%04x",
+                                                                      static_cast<std::uint16_t>(codepoint)));
+                                    bytes += 6;
+                                }
+                                else
+                                {
+                                    // NOLINTNEXTLINE(cppcoreguidelines-pro-type-vararg,hicpp-vararg)
+                                    static_cast<void>((std::snprintf)(string_buffer.data() + bytes, 13, "\\u%04x\\u%04x",
+                                                                      static_cast<std::uint16_t>(0xD7C0u + (codepoint >> 10u)),
+                                                                      static_cast<std::uint16_t>(0xDC00u + (codepoint & 0x3FFu))));
+                                    bytes += 12;
+                                }
+                            }
+                            else
+                            {
+                                // copy byte to buffer (all previous bytes
+                                // have been copied in the default case above)
+                                string_buffer[bytes++] = s[i];
+                            }
+                            break;
+                        }
+                    }
+
+                    // write buffer and reset index; there must be 13 bytes
+                    // left, as this is the maximal number of bytes to be
+                    // written ("\uxxxx\uxxxx\0") for one code point
+                    if (string_buffer.size() - bytes < 13)
+                    {
+                        o->write_characters(string_buffer.data(), bytes);
+                        bytes = 0;
+                    }
+
+                    // remember the byte position of this accept
+                    bytes_after_last_accept = bytes;
+                    undumped_chars = 0;
+                    break;
+                }
+
+                case UTF8_REJECT:  // decode found invalid UTF-8 byte
+                {
+                    switch (error_handler)
+                    {
+                        case error_handler_t::strict:
+                        {
+                            std::stringstream ss;
+                            ss << std::uppercase << std::setfill('0') << std::setw(2) << std::hex << (byte | 0);
+                            JSON_THROW(type_error::create(316, "invalid UTF-8 byte at index " + std::to_string(i) + ": 0x" + ss.str(), BasicJsonType()));
+                        }
+
+                        case error_handler_t::ignore:
+                        case error_handler_t::replace:
+                        {
+                            // if this byte arrived as part of a (now invalid)
+                            // multi-byte sequence, read it again: the byte may
+                            // be OK on its own, just not as a continuation of
+                            // the previous sequence
+                            if (undumped_chars > 0)
+                            {
+                                --i;
+                            }
+
+                            // reset the buffer length to the last accepted index,
+                            // thus removing/ignoring the invalid characters
+                            bytes = bytes_after_last_accept;
+
+                            if (error_handler == error_handler_t::replace)
+                            {
+                                // add a replacement character
+                                if (ensure_ascii)
+                                {
+                                    string_buffer[bytes++] = '\\';
+                                    string_buffer[bytes++] = 'u';
+                                    string_buffer[bytes++] = 'f';
+                                    string_buffer[bytes++] = 'f';
+                                    string_buffer[bytes++] = 'f';
+                                    string_buffer[bytes++] = 'd';
+                                }
+                                else
+                                {
+                                    string_buffer[bytes++] = detail::binary_writer<BasicJsonType, char>::to_char_type('\xEF');
+                                    string_buffer[bytes++] = detail::binary_writer<BasicJsonType, char>::to_char_type('\xBF');
+                                    string_buffer[bytes++] = detail::binary_writer<BasicJsonType, char>::to_char_type('\xBD');
+                                }
+
+                                // write buffer and reset index; there must be 13 bytes
+                                // left, as this is the maximal number of bytes to be
+                                // written ("\uxxxx\uxxxx\0") for one code point
+                                if (string_buffer.size() - bytes < 13)
+                                {
+                                    o->write_characters(string_buffer.data(), bytes);
+                                    bytes = 0;
+                                }
+
+                                bytes_after_last_accept = bytes;
+                            }
+
+                            undumped_chars = 0;
+
+                            // continue processing the string
+                            state = UTF8_ACCEPT;
+                            break;
+                        }
+
+                        default:            // LCOV_EXCL_LINE
+                            JSON_ASSERT(false); // NOLINT(cert-dcl03-c,hicpp-static-assert,misc-static-assert) LCOV_EXCL_LINE
+                    }
+                    break;
+                }
+
+                default:  // decode found yet incomplete multi-byte code point
+                {
+                    if (!ensure_ascii)
+                    {
+                        // code point will not be escaped - copy byte to buffer
+                        string_buffer[bytes++] = s[i];
+                    }
+                    ++undumped_chars;
+                    break;
+                }
+            }
+        }
+
+        // we finished processing the string
+        if (JSON_HEDLEY_LIKELY(state == UTF8_ACCEPT))
+        {
+            // write buffer
+            if (bytes > 0)
+            {
+                o->write_characters(string_buffer.data(), bytes);
+            }
+        }
+        else
+        {
+            // we finished reading, but did not accept: the string was incomplete
+            switch (error_handler)
+            {
+                case error_handler_t::strict:
+                {
+                    std::stringstream ss;
+                    ss << std::uppercase << std::setfill('0') << std::setw(2) << std::hex << (static_cast<std::uint8_t>(s.back()) | 0);
+                    JSON_THROW(type_error::create(316, "incomplete UTF-8 string; last byte: 0x" + ss.str(), BasicJsonType()));
+                }
+
+                case error_handler_t::ignore:
+                {
+                    // write all accepted bytes
+                    o->write_characters(string_buffer.data(), bytes_after_last_accept);
+                    break;
+                }
+
+                case error_handler_t::replace:
+                {
+                    // write all accepted bytes
+                    o->write_characters(string_buffer.data(), bytes_after_last_accept);
+                    // add a replacement character
+                    if (ensure_ascii)
+                    {
+                        o->write_characters("\\ufffd", 6);
+                    }
+                    else
+                    {
+                        o->write_characters("\xEF\xBF\xBD", 3);
+                    }
+                    break;
+                }
+
+                default:            // LCOV_EXCL_LINE
+                    JSON_ASSERT(false); // NOLINT(cert-dcl03-c,hicpp-static-assert,misc-static-assert) LCOV_EXCL_LINE
+            }
+        }
+    }
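+    // Examples: dump_escaped("say \"hi\"", false) writes: say \"hi\"
+    // and dump_escaped("\xC3\xA4", true) escapes U+00E4 as: \u00e4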
+
+  private:
+    /*!
+    @brief count digits
+
+    Count the number of decimal (base 10) digits for an input unsigned integer.
+
+    @param[in] x  unsigned integer number to count its digits
+    @return    number of decimal digits
+    */
+    inline unsigned int count_digits(number_unsigned_t x) noexcept
+    {
+        unsigned int n_digits = 1;
+        for (;;)
+        {
+            if (x < 10)
+            {
+                return n_digits;
+            }
+            if (x < 100)
+            {
+                return n_digits + 1;
+            }
+            if (x < 1000)
+            {
+                return n_digits + 2;
+            }
+            if (x < 10000)
+            {
+                return n_digits + 3;
+            }
+            x = x / 10000u;
+            n_digits += 4;
+        }
+    }
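+    // Examples: count_digits(4711) returns 1 + 3 = 4 on the first pass;
+    // count_digits(100000) strips one group of four digits (x becomes 10,
+    // n_digits becomes 5) and returns 6 on the second pass.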
+
+    // templates to avoid warnings about useless casts
+    template <typename NumberType, enable_if_t<std::is_signed<NumberType>::value, int> = 0>
+    bool is_negative_number(NumberType x)
+    {
+        return x < 0;
+    }
+
+    template < typename NumberType, enable_if_t <std::is_unsigned<NumberType>::value, int > = 0 >
+    bool is_negative_number(NumberType /*unused*/)
+    {
+        return false;
+    }
+
+    /*!
+    @brief dump an integer
+
+    Dump a given integer to output stream @a o. Works internally with
+    @a number_buffer.
+
+    @param[in] x  integer number (signed or unsigned) to dump
+    @tparam NumberType either @a number_integer_t, @a number_unsigned_t, or @a binary_char_t
+    */
+    template < typename NumberType, detail::enable_if_t <
+                   std::is_integral<NumberType>::value ||
+                   std::is_same<NumberType, number_unsigned_t>::value ||
+                   std::is_same<NumberType, number_integer_t>::value ||
+                   std::is_same<NumberType, binary_char_t>::value,
+                   int > = 0 >
+    void dump_integer(NumberType x)
+    {
+        static constexpr std::array<std::array<char, 2>, 100> digits_to_99
+        {
+            {
+                {{'0', '0'}}, {{'0', '1'}}, {{'0', '2'}}, {{'0', '3'}}, {{'0', '4'}}, {{'0', '5'}}, {{'0', '6'}}, {{'0', '7'}}, {{'0', '8'}}, {{'0', '9'}},
+                {{'1', '0'}}, {{'1', '1'}}, {{'1', '2'}}, {{'1', '3'}}, {{'1', '4'}}, {{'1', '5'}}, {{'1', '6'}}, {{'1', '7'}}, {{'1', '8'}}, {{'1', '9'}},
+                {{'2', '0'}}, {{'2', '1'}}, {{'2', '2'}}, {{'2', '3'}}, {{'2', '4'}}, {{'2', '5'}}, {{'2', '6'}}, {{'2', '7'}}, {{'2', '8'}}, {{'2', '9'}},
+                {{'3', '0'}}, {{'3', '1'}}, {{'3', '2'}}, {{'3', '3'}}, {{'3', '4'}}, {{'3', '5'}}, {{'3', '6'}}, {{'3', '7'}}, {{'3', '8'}}, {{'3', '9'}},
+                {{'4', '0'}}, {{'4', '1'}}, {{'4', '2'}}, {{'4', '3'}}, {{'4', '4'}}, {{'4', '5'}}, {{'4', '6'}}, {{'4', '7'}}, {{'4', '8'}}, {{'4', '9'}},
+                {{'5', '0'}}, {{'5', '1'}}, {{'5', '2'}}, {{'5', '3'}}, {{'5', '4'}}, {{'5', '5'}}, {{'5', '6'}}, {{'5', '7'}}, {{'5', '8'}}, {{'5', '9'}},
+                {{'6', '0'}}, {{'6', '1'}}, {{'6', '2'}}, {{'6', '3'}}, {{'6', '4'}}, {{'6', '5'}}, {{'6', '6'}}, {{'6', '7'}}, {{'6', '8'}}, {{'6', '9'}},
+                {{'7', '0'}}, {{'7', '1'}}, {{'7', '2'}}, {{'7', '3'}}, {{'7', '4'}}, {{'7', '5'}}, {{'7', '6'}}, {{'7', '7'}}, {{'7', '8'}}, {{'7', '9'}},
+                {{'8', '0'}}, {{'8', '1'}}, {{'8', '2'}}, {{'8', '3'}}, {{'8', '4'}}, {{'8', '5'}}, {{'8', '6'}}, {{'8', '7'}}, {{'8', '8'}}, {{'8', '9'}},
+                {{'9', '0'}}, {{'9', '1'}}, {{'9', '2'}}, {{'9', '3'}}, {{'9', '4'}}, {{'9', '5'}}, {{'9', '6'}}, {{'9', '7'}}, {{'9', '8'}}, {{'9', '9'}},
+            }
+        };
+
+        // special case for "0"
+        if (x == 0)
+        {
+            o->write_character('0');
+            return;
+        }
+
+        // use a pointer to fill the buffer
+        auto buffer_ptr = number_buffer.begin(); // NOLINT(llvm-qualified-auto,readability-qualified-auto,cppcoreguidelines-pro-type-vararg,hicpp-vararg)
+
+        number_unsigned_t abs_value;
+
+        unsigned int n_chars{};
+
+        if (is_negative_number(x))
+        {
+            *buffer_ptr = '-';
+            abs_value = remove_sign(static_cast<number_integer_t>(x));
+
+            // account one more byte for the minus sign
+            n_chars = 1 + count_digits(abs_value);
+        }
+        else
+        {
+            abs_value = static_cast<number_unsigned_t>(x);
+            n_chars = count_digits(abs_value);
+        }
+
+        // spare 1 byte for '\0'
+        JSON_ASSERT(n_chars < number_buffer.size() - 1);
+
+        // jump to the end to generate the string from backward,
+        // so we later avoid reversing the result
+        buffer_ptr += n_chars;
+
+        // Fast int2ascii implementation inspired by "Fastware" talk by Andrei Alexandrescu
+        // See: https://www.youtube.com/watch?v=o4-CwDo2zpg
+        while (abs_value >= 100)
+        {
+            const auto digits_index = static_cast<unsigned>((abs_value % 100));
+            abs_value /= 100;
+            *(--buffer_ptr) = digits_to_99[digits_index][1];
+            *(--buffer_ptr) = digits_to_99[digits_index][0];
+        }
+
+        if (abs_value >= 10)
+        {
+            const auto digits_index = static_cast<unsigned>(abs_value);
+            *(--buffer_ptr) = digits_to_99[digits_index][1];
+            *(--buffer_ptr) = digits_to_99[digits_index][0];
+        }
+        else
+        {
+            *(--buffer_ptr) = static_cast<char>('0' + abs_value);
+        }
+
+        o->write_characters(number_buffer.data(), n_chars);
+    }
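+    // Example: for x = -4711 the buffer is filled backwards: digits_to_99[11]
+    // contributes "11", digits_to_99[47] contributes "47", and together with
+    // the leading '-' the buffer holds "-4711" (n_chars = 5).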
+
+    /*!
+    @brief dump a floating-point number
+
+    Dump a given floating-point number to output stream @a o. Works internally
+    with @a number_buffer.
+
+    @param[in] x  floating-point number to dump
+    */
+    void dump_float(number_float_t x)
+    {
+        // NaN / inf
+        if (!std::isfinite(x))
+        {
+            o->write_characters("null", 4);
+            return;
+        }
+
+        // If number_float_t is an IEEE-754 single or double precision number,
+        // use the Grisu2 algorithm to produce short numbers which are
+        // guaranteed to round-trip, using strtof and strtod, resp.
+        //
+        // NB: The test below works if <long double> == <double>.
+        static constexpr bool is_ieee_single_or_double
+            = (std::numeric_limits<number_float_t>::is_iec559 && std::numeric_limits<number_float_t>::digits == 24 && std::numeric_limits<number_float_t>::max_exponent == 128) ||
+              (std::numeric_limits<number_float_t>::is_iec559 && std::numeric_limits<number_float_t>::digits == 53 && std::numeric_limits<number_float_t>::max_exponent == 1024);
+
+        dump_float(x, std::integral_constant<bool, is_ieee_single_or_double>());
+    }
+
+    void dump_float(number_float_t x, std::true_type /*is_ieee_single_or_double*/)
+    {
+        auto* begin = number_buffer.data();
+        auto* end = ::nlohmann::detail::to_chars(begin, begin + number_buffer.size(), x);
+
+        o->write_characters(begin, static_cast<size_t>(end - begin));
+    }
+
+    void dump_float(number_float_t x, std::false_type /*is_ieee_single_or_double*/)
+    {
+        // get number of digits for a float -> text -> float round-trip
+        static constexpr auto d = std::numeric_limits<number_float_t>::max_digits10;
+
+        // the actual conversion
+        // NOLINTNEXTLINE(cppcoreguidelines-pro-type-vararg,hicpp-vararg)
+        std::ptrdiff_t len = (std::snprintf)(number_buffer.data(), number_buffer.size(), "%.*g", d, x);
+
+        // negative value indicates an error
+        JSON_ASSERT(len > 0);
+        // check if buffer was large enough
+        JSON_ASSERT(static_cast<std::size_t>(len) < number_buffer.size());
+
+        // erase thousands separator
+        if (thousands_sep != '\0')
+        {
+            // NOLINTNEXTLINE(readability-qualified-auto,llvm-qualified-auto): std::remove returns an iterator, see https://github.com/nlohmann/json/issues/3081
+            const auto end = std::remove(number_buffer.begin(), number_buffer.begin() + len, thousands_sep);
+            std::fill(end, number_buffer.end(), '\0');
+            JSON_ASSERT((end - number_buffer.begin()) <= len);
+            len = (end - number_buffer.begin());
+        }
+
+        // convert decimal point to '.'
+        if (decimal_point != '\0' && decimal_point != '.')
+        {
+            // NOLINTNEXTLINE(readability-qualified-auto,llvm-qualified-auto): std::find returns an iterator, see https://github.com/nlohmann/json/issues/3081
+            const auto dec_pos = std::find(number_buffer.begin(), number_buffer.end(), decimal_point);
+            if (dec_pos != number_buffer.end())
+            {
+                *dec_pos = '.';
+            }
+        }
+
+        o->write_characters(number_buffer.data(), static_cast<std::size_t>(len));
+
+        // determine if we need to append ".0"
+        const bool value_is_int_like =
+            std::none_of(number_buffer.begin(), number_buffer.begin() + len + 1,
+                         [](char c)
+        {
+            return c == '.' || c == 'e';
+        });
+
+        if (value_is_int_like)
+        {
+            o->write_characters(".0", 2);
+        }
+    }
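+    // For example, in a locale whose decimal point is ',', snprintf may produce
+    // "3,14"; the conversion above rewrites it to "3.14" so that the output is
+    // always valid JSON.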
+
+    /*!
+    @brief check whether a string is UTF-8 encoded
+
+    The function checks each byte of a string whether it is UTF-8 encoded. The
+    result of the check is stored in the @a state parameter. The function must
+    be called initially with state 0 (accept). State 1 means the string must
+    be rejected, because the current byte is not allowed. If the string is
+    completely processed, but the state is non-zero, the string ended
+    prematurely; that is, the last byte indicated more bytes should have
+    followed.
+
+    @param[in,out] state  the state of the decoding
+    @param[in,out] codep  codepoint (valid only if resulting state is UTF8_ACCEPT)
+    @param[in] byte       next byte to decode
+    @return               new state
+
+    @note The function has been edited: a std::array is used.
+
+    @copyright Copyright (c) 2008-2009 Bjoern Hoehrmann <bjoern@hoehrmann.de>
+    @sa http://bjoern.hoehrmann.de/utf-8/decoder/dfa/
+    */
+    static std::uint8_t decode(std::uint8_t& state, std::uint32_t& codep, const std::uint8_t byte) noexcept
+    {
+        static const std::array<std::uint8_t, 400> utf8d =
+        {
+            {
+                0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, // 00..1F
+                0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, // 20..3F
+                0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, // 40..5F
+                0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, // 60..7F
+                1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, 9, // 80..9F
+                7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, // A0..BF
+                8, 8, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, // C0..DF
+                0xA, 0x3, 0x3, 0x3, 0x3, 0x3, 0x3, 0x3, 0x3, 0x3, 0x3, 0x3, 0x3, 0x4, 0x3, 0x3, // E0..EF
+                0xB, 0x6, 0x6, 0x6, 0x5, 0x8, 0x8, 0x8, 0x8, 0x8, 0x8, 0x8, 0x8, 0x8, 0x8, 0x8, // F0..FF
+                0x0, 0x1, 0x2, 0x3, 0x5, 0x8, 0x7, 0x1, 0x1, 0x1, 0x4, 0x6, 0x1, 0x1, 0x1, 0x1, // s0..s0
+                1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 1, 1, 1, 1, 1, 0, 1, 0, 1, 1, 1, 1, 1, 1, // s1..s2
+                1, 2, 1, 1, 1, 1, 1, 2, 1, 2, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 1, 1, 1, 1, 1, 1, 1, 1, // s3..s4
+                1, 2, 1, 1, 1, 1, 1, 1, 1, 2, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 3, 1, 3, 1, 1, 1, 1, 1, 1, // s5..s6
+                1, 3, 1, 1, 1, 1, 1, 3, 1, 3, 1, 1, 1, 1, 1, 1, 1, 3, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 // s7..s8
+            }
+        };
+
+        JSON_ASSERT(byte < utf8d.size());
+        const std::uint8_t type = utf8d[byte];
+
+        codep = (state != UTF8_ACCEPT)
+                ? (byte & 0x3fu) | (codep << 6u)
+                : (0xFFu >> type) & (byte);
+
+        std::size_t index = 256u + static_cast<size_t>(state) * 16u + static_cast<size_t>(type);
+        JSON_ASSERT(index < 400);
+        state = utf8d[index];
+        return state;
+    }
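+    // Worked example: decoding U+20AC (Euro sign, UTF-8 bytes 0xE2 0x82 0xAC):
+    //
+    //      byte 0xE2: type = 3, state ACCEPT -> 3, codep = (0xFF >> 3) & 0xE2 = 0x02
+    //      byte 0x82: type = 1, state 3 -> 2,      codep = (0x82 & 0x3F) | (0x02 << 6) = 0x82
+    //      byte 0xAC: type = 7, state 2 -> ACCEPT, codep = (0xAC & 0x3F) | (0x82 << 6) = 0x20AC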
+
+    /*
+     * Overload to make the compiler happy while it is instantiating
+     * dump_integer for number_unsigned_t.
+     * Must never be called.
+     */
+    number_unsigned_t remove_sign(number_unsigned_t x)
+    {
+        JSON_ASSERT(false); // NOLINT(cert-dcl03-c,hicpp-static-assert,misc-static-assert) LCOV_EXCL_LINE
+        return x; // LCOV_EXCL_LINE
+    }
+
+    /*
+     * Helper function for dump_integer
+     *
+     * This function takes a negative signed integer and returns its absolute
+     * value as an unsigned integer. The plus/minus shuffling is necessary
+     * because we cannot directly negate an arbitrary signed integer: the
+     * absolute values of INT_MIN and INT_MAX are usually not the same. See
+     * #1708 for details.
+     */
+    inline number_unsigned_t remove_sign(number_integer_t x) noexcept
+    {
+        JSON_ASSERT(x < 0 && x < (std::numeric_limits<number_integer_t>::max)()); // NOLINT(misc-redundant-expression)
+        return static_cast<number_unsigned_t>(-(x + 1)) + 1;
+    }
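+
+    // Worked example (assuming two's complement and number_integer_t =
+    // int64_t): for x = INT64_MIN, the naive -x would overflow, but
+    // -(x + 1) = INT64_MAX is representable; casting to unsigned and adding
+    // 1 then yields the correct absolute value 9223372036854775808 without
+    // any signed overflow.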
+
+  private:
+    /// the output of the serializer
+    output_adapter_t<char> o = nullptr;
+
+    /// a (hopefully) large enough character buffer
+    std::array<char, 64> number_buffer{{}};
+
+    /// the locale
+    const std::lconv* loc = nullptr;
+    /// the locale's thousand separator character
+    const char thousands_sep = '\0';
+    /// the locale's decimal point character
+    const char decimal_point = '\0';
+
+    /// string buffer
+    std::array<char, 512> string_buffer{{}};
+
+    /// the indentation character
+    const char indent_char;
+    /// the indentation string
+    string_t indent_string;
+
+    /// the error handler: how to react to decoding errors
+    const error_handler_t error_handler;
+};
+}  // namespace detail
+}  // namespace nlohmann
+
+// #include <nlohmann/detail/value_t.hpp>
+
+// #include <nlohmann/json_fwd.hpp>
+
+// #include <nlohmann/ordered_map.hpp>
+
+
+#include <functional> // less
+#include <initializer_list> // initializer_list
+#include <iterator> // input_iterator_tag, iterator_traits
+#include <memory> // allocator
+#include <stdexcept> // for out_of_range
+#include <type_traits> // enable_if, is_convertible
+#include <utility> // pair
+#include <vector> // vector
+
+// #include <nlohmann/detail/macro_scope.hpp>
+
+
+namespace nlohmann
+{
+
+/// ordered_map: a minimal map-like container that preserves insertion order
+/// for use within nlohmann::basic_json<ordered_map>
+template <class Key, class T, class IgnoredLess = std::less<Key>,
+          class Allocator = std::allocator<std::pair<const Key, T>>>
+struct ordered_map : std::vector<std::pair<const Key, T>, Allocator>
+{
+    using key_type = Key;
+    using mapped_type = T;
+    using Container = std::vector<std::pair<const Key, T>, Allocator>;
+    using iterator = typename Container::iterator;
+    using const_iterator = typename Container::const_iterator;
+    using size_type = typename Container::size_type;
+    using value_type = typename Container::value_type;
+
+    // Explicit constructors instead of `using Container::Container`
+    // otherwise older compilers choke on it (GCC <= 5.5, xcode <= 9.4)
+    ordered_map(const Allocator& alloc = Allocator()) : Container{alloc} {}
+    template <class It>
+    ordered_map(It first, It last, const Allocator& alloc = Allocator())
+        : Container{first, last, alloc} {}
+    ordered_map(std::initializer_list<T> init, const Allocator& alloc = Allocator())
+        : Container{init, alloc} {}
+
+    std::pair<iterator, bool> emplace(const key_type& key, T&& t)
+    {
+        for (auto it = this->begin(); it != this->end(); ++it)
+        {
+            if (it->first == key)
+            {
+                return {it, false};
+            }
+        }
+        Container::emplace_back(key, t);
+        return {--this->end(), true};
+    }
+
+    T& operator[](const Key& key)
+    {
+        return emplace(key, T{}).first->second;
+    }
+
+    const T& operator[](const Key& key) const
+    {
+        return at(key);
+    }
+
+    T& at(const Key& key)
+    {
+        for (auto it = this->begin(); it != this->end(); ++it)
+        {
+            if (it->first == key)
+            {
+                return it->second;
+            }
+        }
+
+        JSON_THROW(std::out_of_range("key not found"));
+    }
+
+    const T& at(const Key& key) const
+    {
+        for (auto it = this->begin(); it != this->end(); ++it)
+        {
+            if (it->first == key)
+            {
+                return it->second;
+            }
+        }
+
+        JSON_THROW(std::out_of_range("key not found"));
+    }
+
+    size_type erase(const Key& key)
+    {
+        for (auto it = this->begin(); it != this->end(); ++it)
+        {
+            if (it->first == key)
+            {
+                // Since we cannot move const Keys, re-construct them in place
+                for (auto next = it; ++next != this->end(); ++it)
+                {
+                    it->~value_type(); // Destroy but keep allocation
+                    new (&*it) value_type{std::move(*next)};
+                }
+                Container::pop_back();
+                return 1;
+            }
+        }
+        return 0;
+    }
+
+    iterator erase(iterator pos)
+    {
+        return erase(pos, std::next(pos));
+    }
+
+    iterator erase(iterator first, iterator last)
+    {
+        const auto elements_affected = std::distance(first, last);
+        const auto offset = std::distance(Container::begin(), first);
+
+        // This is the start situation. We need to delete elements_affected
+        // elements (3 in this example: e, f, g), and need to return an
+        // iterator past the last deleted element (h in this example).
+        // Note that offset is the distance from the start of the vector
+        // to first. We will need this later.
+
+        // [ a, b, c, d, e, f, g, h, i, j ]
+        //               ^        ^
+        //             first    last
+
+        // Since we cannot move const Keys, we re-construct them in place.
+        // We start at first and re-construct (viz. copy) the elements from
+        // the back of the vector. Example for first iteration:
+
+        //               ,--------.
+        //               v        |   destroy e and re-construct with h
+        // [ a, b, c, d, e, f, g, h, i, j ]
+        //               ^        ^
+        //               it       it + elements_affected
+
+        for (auto it = first; std::next(it, elements_affected) != Container::end(); ++it)
+        {
+            it->~value_type(); // destroy but keep allocation
+            new (&*it) value_type{std::move(*std::next(it, elements_affected))}; // "move" next element to it
+        }
+
+        // [ a, b, c, d, h, i, j, h, i, j ]
+        //               ^        ^
+        //             first    last
+
+        // remove the unneeded elements at the end of the vector
+        Container::resize(this->size() - static_cast<size_type>(elements_affected));
+
+        // [ a, b, c, d, h, i, j ]
+        //               ^        ^
+        //             first    last
+
+        // first is now pointing past the last deleted element, but we cannot
+        // use this iterator, because it may have been invalidated by the
+        // resize call. Instead, we can return begin() + offset.
+        return Container::begin() + offset;
+    }
+
+    size_type count(const Key& key) const
+    {
+        for (auto it = this->begin(); it != this->end(); ++it)
+        {
+            if (it->first == key)
+            {
+                return 1;
+            }
+        }
+        return 0;
+    }
+
+    iterator find(const Key& key)
+    {
+        for (auto it = this->begin(); it != this->end(); ++it)
+        {
+            if (it->first == key)
+            {
+                return it;
+            }
+        }
+        return Container::end();
+    }
+
+    const_iterator find(const Key& key) const
+    {
+        for (auto it = this->begin(); it != this->end(); ++it)
+        {
+            if (it->first == key)
+            {
+                return it;
+            }
+        }
+        return Container::end();
+    }
+
+    std::pair<iterator, bool> insert(value_type&& value)
+    {
+        return emplace(value.first, std::move(value.second));
+    }
+
+    std::pair<iterator, bool> insert(const value_type& value)
+    {
+        for (auto it = this->begin(); it != this->end(); ++it)
+        {
+            if (it->first == value.first)
+            {
+                return {it, false};
+            }
+        }
+        Container::push_back(value);
+        return {--this->end(), true};
+    }
+
+    template<typename InputIt>
+    using require_input_iter = typename std::enable_if<std::is_convertible<typename std::iterator_traits<InputIt>::iterator_category,
+            std::input_iterator_tag>::value>::type;
+
+    template<typename InputIt, typename = require_input_iter<InputIt>>
+    void insert(InputIt first, InputIt last)
+    {
+        for (auto it = first; it != last; ++it)
+        {
+            insert(*it);
+        }
+    }
+};
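+
+// Illustrative sketch (not part of the library): ordered_map keeps keys in
+// insertion order and uses linear scans for lookup, which suits the
+// typically small objects stored in nlohmann::ordered_json.
+//
+//     nlohmann::ordered_map<std::string, int> m;
+//     m["zebra"] = 1;
+//     m["apple"] = 2;
+//     // iteration visits "zebra" before "apple" (insertion order),
+//     // whereas std::map would sort the keys alphabetically.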
+
+}  // namespace nlohmann
+
+
+#if defined(JSON_HAS_CPP_17)
+    #include <string_view>
+#endif
+
+/*!
+@brief namespace for Niels Lohmann
+@see https://github.com/nlohmann
+@since version 1.0.0
+*/
+namespace nlohmann
+{
+
+/*!
+@brief a class to store JSON values
+
+@internal
+@invariant The member variables @a m_value and @a m_type have the following
+relationship:
+- If `m_type == value_t::object`, then `m_value.object != nullptr`.
+- If `m_type == value_t::array`, then `m_value.array != nullptr`.
+- If `m_type == value_t::string`, then `m_value.string != nullptr`.
+The invariants are checked by member function assert_invariant().
+
+@note ObjectType trick from https://stackoverflow.com/a/9860911
+@endinternal
+
+@since version 1.0.0
+
+@nosubgrouping
+*/
+NLOHMANN_BASIC_JSON_TPL_DECLARATION
+class basic_json // NOLINT(cppcoreguidelines-special-member-functions,hicpp-special-member-functions)
+{
+  private:
+    template<detail::value_t> friend struct detail::external_constructor;
+    friend ::nlohmann::json_pointer<basic_json>;
+
+    template<typename BasicJsonType, typename InputType>
+    friend class ::nlohmann::detail::parser;
+    friend ::nlohmann::detail::serializer<basic_json>;
+    template<typename BasicJsonType>
+    friend class ::nlohmann::detail::iter_impl;
+    template<typename BasicJsonType, typename CharType>
+    friend class ::nlohmann::detail::binary_writer;
+    template<typename BasicJsonType, typename InputType, typename SAX>
+    friend class ::nlohmann::detail::binary_reader;
+    template<typename BasicJsonType>
+    friend class ::nlohmann::detail::json_sax_dom_parser;
+    template<typename BasicJsonType>
+    friend class ::nlohmann::detail::json_sax_dom_callback_parser;
+    friend class ::nlohmann::detail::exception;
+
+    /// workaround type for MSVC
+    using basic_json_t = NLOHMANN_BASIC_JSON_TPL;
+
+  JSON_PRIVATE_UNLESS_TESTED:
+    // convenience aliases for types residing in namespace detail
+    using lexer = ::nlohmann::detail::lexer_base<basic_json>;
+
+    template<typename InputAdapterType>
+    static ::nlohmann::detail::parser<basic_json, InputAdapterType> parser(
+        InputAdapterType adapter,
+        detail::parser_callback_t<basic_json> cb = nullptr,
+        const bool allow_exceptions = true,
+        const bool ignore_comments = false)
+    {
+        return ::nlohmann::detail::parser<basic_json, InputAdapterType>(std::move(adapter),
+                std::move(cb), allow_exceptions, ignore_comments);
+    }
+
+  private:
+    using primitive_iterator_t = ::nlohmann::detail::primitive_iterator_t;
+    template<typename BasicJsonType>
+    using internal_iterator = ::nlohmann::detail::internal_iterator<BasicJsonType>;
+    template<typename BasicJsonType>
+    using iter_impl = ::nlohmann::detail::iter_impl<BasicJsonType>;
+    template<typename Iterator>
+    using iteration_proxy = ::nlohmann::detail::iteration_proxy<Iterator>;
+    template<typename Base> using json_reverse_iterator = ::nlohmann::detail::json_reverse_iterator<Base>;
+
+    template<typename CharType>
+    using output_adapter_t = ::nlohmann::detail::output_adapter_t<CharType>;
+
+    template<typename InputType>
+    using binary_reader = ::nlohmann::detail::binary_reader<basic_json, InputType>;
+    template<typename CharType> using binary_writer = ::nlohmann::detail::binary_writer<basic_json, CharType>;
+
+  JSON_PRIVATE_UNLESS_TESTED:
+    using serializer = ::nlohmann::detail::serializer<basic_json>;
+
+  public:
+    using value_t = detail::value_t;
+    /// JSON Pointer, see @ref nlohmann::json_pointer
+    using json_pointer = ::nlohmann::json_pointer<basic_json>;
+    template<typename T, typename SFINAE>
+    using json_serializer = JSONSerializer<T, SFINAE>;
+    /// how to treat decoding errors
+    using error_handler_t = detail::error_handler_t;
+    /// how to treat CBOR tags
+    using cbor_tag_handler_t = detail::cbor_tag_handler_t;
+    /// helper type for initializer lists of basic_json values
+    using initializer_list_t = std::initializer_list<detail::json_ref<basic_json>>;
+
+    using input_format_t = detail::input_format_t;
+    /// SAX interface type, see @ref nlohmann::json_sax
+    using json_sax_t = json_sax<basic_json>;
+
+    ////////////////
+    // exceptions //
+    ////////////////
+
+    /// @name exceptions
+    /// Exception classes thrown by the library.
+    /// @{
+
+    using exception = detail::exception;
+    using parse_error = detail::parse_error;
+    using invalid_iterator = detail::invalid_iterator;
+    using type_error = detail::type_error;
+    using out_of_range = detail::out_of_range;
+    using other_error = detail::other_error;
+
+    /// @}
+
+
+    /////////////////////
+    // container types //
+    /////////////////////
+
+    /// @name container types
+    /// The canonical container types to use @ref basic_json like any other
+    /// STL container.
+    /// @{
+
+    /// the type of elements in a basic_json container
+    using value_type = basic_json;
+
+    /// the type of an element reference
+    using reference = value_type&;
+    /// the type of an element const reference
+    using const_reference = const value_type&;
+
+    /// a type to represent differences between iterators
+    using difference_type = std::ptrdiff_t;
+    /// a type to represent container sizes
+    using size_type = std::size_t;
+
+    /// the allocator type
+    using allocator_type = AllocatorType<basic_json>;
+
+    /// the type of an element pointer
+    using pointer = typename std::allocator_traits<allocator_type>::pointer;
+    /// the type of an element const pointer
+    using const_pointer = typename std::allocator_traits<allocator_type>::const_pointer;
+
+    /// an iterator for a basic_json container
+    using iterator = iter_impl<basic_json>;
+    /// a const iterator for a basic_json container
+    using const_iterator = iter_impl<const basic_json>;
+    /// a reverse iterator for a basic_json container
+    using reverse_iterator = json_reverse_iterator<typename basic_json::iterator>;
+    /// a const reverse iterator for a basic_json container
+    using const_reverse_iterator = json_reverse_iterator<typename basic_json::const_iterator>;
+
+    /// @}
+
+
+    /// @brief returns the allocator associated with the container
+    /// @sa https://json.nlohmann.me/api/basic_json/get_allocator/
+    static allocator_type get_allocator()
+    {
+        return allocator_type();
+    }
+
+    /// @brief returns version information on the library
+    /// @sa https://json.nlohmann.me/api/basic_json/meta/
+    JSON_HEDLEY_WARN_UNUSED_RESULT
+    static basic_json meta()
+    {
+        basic_json result;
+
+        result["copyright"] = "(C) 2013-2022 Niels Lohmann";
+        result["name"] = "JSON for Modern C++";
+        result["url"] = "https://github.com/nlohmann/json";
+        result["version"]["string"] =
+            std::to_string(NLOHMANN_JSON_VERSION_MAJOR) + "." +
+            std::to_string(NLOHMANN_JSON_VERSION_MINOR) + "." +
+            std::to_string(NLOHMANN_JSON_VERSION_PATCH);
+        result["version"]["major"] = NLOHMANN_JSON_VERSION_MAJOR;
+        result["version"]["minor"] = NLOHMANN_JSON_VERSION_MINOR;
+        result["version"]["patch"] = NLOHMANN_JSON_VERSION_PATCH;
+
+#ifdef _WIN32
+        result["platform"] = "win32";
+#elif defined __linux__
+        result["platform"] = "linux";
+#elif defined __APPLE__
+        result["platform"] = "apple";
+#elif defined __unix__
+        result["platform"] = "unix";
+#else
+        result["platform"] = "unknown";
+#endif
+
+#if defined(__ICC) || defined(__INTEL_COMPILER)
+        result["compiler"] = {{"family", "icc"}, {"version", __INTEL_COMPILER}};
+#elif defined(__clang__)
+        result["compiler"] = {{"family", "clang"}, {"version", __clang_version__}};
+#elif defined(__GNUC__) || defined(__GNUG__)
+        result["compiler"] = {{"family", "gcc"}, {"version", std::to_string(__GNUC__) + "." + std::to_string(__GNUC_MINOR__) + "." + std::to_string(__GNUC_PATCHLEVEL__)}};
+#elif defined(__HP_cc) || defined(__HP_aCC)
+        result["compiler"] = "hp"
+#elif defined(__IBMCPP__)
+        result["compiler"] = {{"family", "ilecpp"}, {"version", __IBMCPP__}};
+#elif defined(_MSC_VER)
+        result["compiler"] = {{"family", "msvc"}, {"version", _MSC_VER}};
+#elif defined(__PGI)
+        result["compiler"] = {{"family", "pgcpp"}, {"version", __PGI}};
+#elif defined(__SUNPRO_CC)
+        result["compiler"] = {{"family", "sunpro"}, {"version", __SUNPRO_CC}};
+#else
+        result["compiler"] = {{"family", "unknown"}, {"version", "unknown"}};
+#endif
+
+#ifdef __cplusplus
+        result["compiler"]["c++"] = std::to_string(__cplusplus);
+#else
+        result["compiler"]["c++"] = "unknown";
+#endif
+        return result;
+    }
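+
+    // Illustrative sketch: the result is itself a JSON object; exact values
+    // depend on the library version, platform, and compiler, e.g.:
+    //
+    //     auto info = basic_json::meta();
+    //     // info["name"]              == "JSON for Modern C++"
+    //     // info["version"]["string"] == "3.10.5" (for this amalgamation)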
+
+
+    ///////////////////////////
+    // JSON value data types //
+    ///////////////////////////
+
+    /// @name JSON value data types
+    /// The data types to store a JSON value. These types are derived from
+    /// the template arguments passed to class @ref basic_json.
+    /// @{
+
+    /// @brief object key comparator type
+    /// @sa https://json.nlohmann.me/api/basic_json/object_comparator_t/
+#if defined(JSON_HAS_CPP_14)
+    // Use transparent comparator if possible, combined with perfect forwarding
+    // on find() and count() calls prevents unnecessary string construction.
+    using object_comparator_t = std::less<>;
+#else
+    using object_comparator_t = std::less<StringType>;
+#endif
+
+    /// @brief a type for an object
+    /// @sa https://json.nlohmann.me/api/basic_json/object_t/
+    using object_t = ObjectType<StringType,
+          basic_json,
+          object_comparator_t,
+          AllocatorType<std::pair<const StringType,
+          basic_json>>>;
+
+    /// @brief a type for an array
+    /// @sa https://json.nlohmann.me/api/basic_json/array_t/
+    using array_t = ArrayType<basic_json, AllocatorType<basic_json>>;
+
+    /// @brief a type for a string
+    /// @sa https://json.nlohmann.me/api/basic_json/string_t/
+    using string_t = StringType;
+
+    /// @brief a type for a boolean
+    /// @sa https://json.nlohmann.me/api/basic_json/boolean_t/
+    using boolean_t = BooleanType;
+
+    /// @brief a type for a number (integer)
+    /// @sa https://json.nlohmann.me/api/basic_json/number_integer_t/
+    using number_integer_t = NumberIntegerType;
+
+    /// @brief a type for a number (unsigned)
+    /// @sa https://json.nlohmann.me/api/basic_json/number_unsigned_t/
+    using number_unsigned_t = NumberUnsignedType;
+
+    /// @brief a type for a number (floating-point)
+    /// @sa https://json.nlohmann.me/api/basic_json/number_float_t/
+    using number_float_t = NumberFloatType;
+
+    /// @brief a type for a packed binary type
+    /// @sa https://json.nlohmann.me/api/basic_json/binary_t/
+    using binary_t = nlohmann::byte_container_with_subtype<BinaryType>;
+
+    /// @}
+
+  private:
+
+    /// helper for exception-safe object creation
+    template<typename T, typename... Args>
+    JSON_HEDLEY_RETURNS_NON_NULL
+    static T* create(Args&& ... args)
+    {
+        AllocatorType<T> alloc;
+        using AllocatorTraits = std::allocator_traits<AllocatorType<T>>;
+
+        auto deleter = [&](T * obj)
+        {
+            AllocatorTraits::deallocate(alloc, obj, 1);
+        };
+        std::unique_ptr<T, decltype(deleter)> obj(AllocatorTraits::allocate(alloc, 1), deleter);
+        AllocatorTraits::construct(alloc, obj.get(), std::forward<Args>(args)...);
+        JSON_ASSERT(obj != nullptr);
+        return obj.release();
+    }
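+
+    // Note on exception safety: allocation and construction are split above
+    // so that, if AllocatorTraits::construct throws, the unique_ptr's custom
+    // deleter still deallocates the raw storage; release() is only reached
+    // once the object has been fully constructed.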
+
+    ////////////////////////
+    // JSON value storage //
+    ////////////////////////
+
+  JSON_PRIVATE_UNLESS_TESTED:
+    /*!
+    @brief a JSON value
+
+    The actual storage for a JSON value of the @ref basic_json class. This
+    union combines the different storage types for the JSON value types
+    defined in @ref value_t.
+
+    JSON type | value_t type    | used type
+    --------- | --------------- | ------------------------
+    object    | object          | pointer to @ref object_t
+    array     | array           | pointer to @ref array_t
+    string    | string          | pointer to @ref string_t
+    boolean   | boolean         | @ref boolean_t
+    number    | number_integer  | @ref number_integer_t
+    number    | number_unsigned | @ref number_unsigned_t
+    number    | number_float    | @ref number_float_t
+    binary    | binary          | pointer to @ref binary_t
+    null      | null            | *no value is stored*
+
+    @note Variable-length types (objects, arrays, and strings) are stored as
+    pointers. The size of the union should not exceed 64 bits if the default
+    value types are used.
+
+    @since version 1.0.0
+    */
+    union json_value
+    {
+        /// object (stored with pointer to save storage)
+        object_t* object;
+        /// array (stored with pointer to save storage)
+        array_t* array;
+        /// string (stored with pointer to save storage)
+        string_t* string;
+        /// binary (stored with pointer to save storage)
+        binary_t* binary;
+        /// boolean
+        boolean_t boolean;
+        /// number (integer)
+        number_integer_t number_integer;
+        /// number (unsigned integer)
+        number_unsigned_t number_unsigned;
+        /// number (floating-point)
+        number_float_t number_float;
+
+        /// default constructor (for null values)
+        json_value() = default;
+        /// constructor for booleans
+        json_value(boolean_t v) noexcept : boolean(v) {}
+        /// constructor for numbers (integer)
+        json_value(number_integer_t v) noexcept : number_integer(v) {}
+        /// constructor for numbers (unsigned)
+        json_value(number_unsigned_t v) noexcept : number_unsigned(v) {}
+        /// constructor for numbers (floating-point)
+        json_value(number_float_t v) noexcept : number_float(v) {}
+        /// constructor for empty values of a given type
+        json_value(value_t t)
+        {
+            switch (t)
+            {
+                case value_t::object:
+                {
+                    object = create<object_t>();
+                    break;
+                }
+
+                case value_t::array:
+                {
+                    array = create<array_t>();
+                    break;
+                }
+
+                case value_t::string:
+                {
+                    string = create<string_t>("");
+                    break;
+                }
+
+                case value_t::binary:
+                {
+                    binary = create<binary_t>();
+                    break;
+                }
+
+                case value_t::boolean:
+                {
+                    boolean = static_cast<boolean_t>(false);
+                    break;
+                }
+
+                case value_t::number_integer:
+                {
+                    number_integer = static_cast<number_integer_t>(0);
+                    break;
+                }
+
+                case value_t::number_unsigned:
+                {
+                    number_unsigned = static_cast<number_unsigned_t>(0);
+                    break;
+                }
+
+                case value_t::number_float:
+                {
+                    number_float = static_cast<number_float_t>(0.0);
+                    break;
+                }
+
+                case value_t::null:
+                {
+                    object = nullptr;  // silence warning, see #821
+                    break;
+                }
+
+                case value_t::discarded:
+                default:
+                {
+                    object = nullptr;  // silence warning, see #821
+                    if (JSON_HEDLEY_UNLIKELY(t == value_t::null))
+                    {
+                        JSON_THROW(other_error::create(500, "961c151d2e87f2686a955a9be24d316f1362bf21 3.10.5", basic_json())); // LCOV_EXCL_LINE
+                    }
+                    break;
+                }
+            }
+        }
+
+        /// constructor for strings
+        json_value(const string_t& value) : string(create<string_t>(value)) {}
+
+        /// constructor for rvalue strings
+        json_value(string_t&& value) : string(create<string_t>(std::move(value))) {}
+
+        /// constructor for objects
+        json_value(const object_t& value) : object(create<object_t>(value)) {}
+
+        /// constructor for rvalue objects
+        json_value(object_t&& value) : object(create<object_t>(std::move(value))) {}
+
+        /// constructor for arrays
+        json_value(const array_t& value) : array(create<array_t>(value)) {}
+
+        /// constructor for rvalue arrays
+        json_value(array_t&& value) : array(create<array_t>(std::move(value))) {}
+
+        /// constructor for binary arrays
+        json_value(const typename binary_t::container_type& value) : binary(create<binary_t>(value)) {}
+
+        /// constructor for rvalue binary arrays
+        json_value(typename binary_t::container_type&& value) : binary(create<binary_t>(std::move(value))) {}
+
+        /// constructor for binary arrays (internal type)
+        json_value(const binary_t& value) : binary(create<binary_t>(value)) {}
+
+        /// constructor for rvalue binary arrays (internal type)
+        json_value(binary_t&& value) : binary(create<binary_t>(std::move(value))) {}
+
+        void destroy(value_t t)
+        {
+            if (t == value_t::array || t == value_t::object)
+            {
+                // flatten the current json_value to a heap-allocated stack
+                // so that deeply nested values are destroyed iteratively
+                // rather than recursively (avoiding stack overflow)
+                std::vector<basic_json> stack;
+
+                // move the top-level items to stack
+                if (t == value_t::array)
+                {
+                    stack.reserve(array->size());
+                    std::move(array->begin(), array->end(), std::back_inserter(stack));
+                }
+                else
+                {
+                    stack.reserve(object->size());
+                    for (auto&& it : *object)
+                    {
+                        stack.push_back(std::move(it.second));
+                    }
+                }
+
+                while (!stack.empty())
+                {
+                    // move the last item to local variable to be processed
+                    basic_json current_item(std::move(stack.back()));
+                    stack.pop_back();
+
+                    // if current_item is array/object, move
+                    // its children to the stack to be processed later
+                    if (current_item.is_array())
+                    {
+                        std::move(current_item.m_value.array->begin(), current_item.m_value.array->end(), std::back_inserter(stack));
+
+                        current_item.m_value.array->clear();
+                    }
+                    else if (current_item.is_object())
+                    {
+                        for (auto&& it : *current_item.m_value.object)
+                        {
+                            stack.push_back(std::move(it.second));
+                        }
+
+                        current_item.m_value.object->clear();
+                    }
+
+                    // it is now safe for current_item to be destructed,
+                    // since it no longer has any children
+                }
+            }
+
+            switch (t)
+            {
+                case value_t::object:
+                {
+                    AllocatorType<object_t> alloc;
+                    std::allocator_traits<decltype(alloc)>::destroy(alloc, object);
+                    std::allocator_traits<decltype(alloc)>::deallocate(alloc, object, 1);
+                    break;
+                }
+
+                case value_t::array:
+                {
+                    AllocatorType<array_t> alloc;
+                    std::allocator_traits<decltype(alloc)>::destroy(alloc, array);
+                    std::allocator_traits<decltype(alloc)>::deallocate(alloc, array, 1);
+                    break;
+                }
+
+                case value_t::string:
+                {
+                    AllocatorType<string_t> alloc;
+                    std::allocator_traits<decltype(alloc)>::destroy(alloc, string);
+                    std::allocator_traits<decltype(alloc)>::deallocate(alloc, string, 1);
+                    break;
+                }
+
+                case value_t::binary:
+                {
+                    AllocatorType<binary_t> alloc;
+                    std::allocator_traits<decltype(alloc)>::destroy(alloc, binary);
+                    std::allocator_traits<decltype(alloc)>::deallocate(alloc, binary, 1);
+                    break;
+                }
+
+                case value_t::null:
+                case value_t::boolean:
+                case value_t::number_integer:
+                case value_t::number_unsigned:
+                case value_t::number_float:
+                case value_t::discarded:
+                default:
+                {
+                    break;
+                }
+            }
+        }
+    };
+
+  private:
+    /*!
+    @brief checks the class invariants
+
+    This function asserts the class invariants. It needs to be called at the
+    end of every constructor to make sure that created objects respect the
+    invariant. Furthermore, it has to be called each time the type of a JSON
+    value is changed, because the invariant expresses a relationship between
+    @a m_type and @a m_value.
+
+    Furthermore, the parent relation is checked for arrays and objects: if
+    @a check_parents is true and the value is an array or object, then the
+    container's elements must have the current value as parent.
+
+    @param[in] check_parents  whether the parent relation should be checked.
+               The value is true by default and should only be set to false
+               during destruction of objects when the invariant does not
+               need to hold.
+    */
+    void assert_invariant(bool check_parents = true) const noexcept
+    {
+        JSON_ASSERT(m_type != value_t::object || m_value.object != nullptr);
+        JSON_ASSERT(m_type != value_t::array || m_value.array != nullptr);
+        JSON_ASSERT(m_type != value_t::string || m_value.string != nullptr);
+        JSON_ASSERT(m_type != value_t::binary || m_value.binary != nullptr);
+
+#if JSON_DIAGNOSTICS
+        JSON_TRY
+        {
+            // cppcheck-suppress assertWithSideEffect
+            JSON_ASSERT(!check_parents || !is_structured() || std::all_of(begin(), end(), [this](const basic_json & j)
+            {
+                return j.m_parent == this;
+            }));
+        }
+        JSON_CATCH(...) {} // LCOV_EXCL_LINE
+#endif
+        static_cast<void>(check_parents);
+    }
+
+    void set_parents()
+    {
+#if JSON_DIAGNOSTICS
+        switch (m_type)
+        {
+            case value_t::array:
+            {
+                for (auto& element : *m_value.array)
+                {
+                    element.m_parent = this;
+                }
+                break;
+            }
+
+            case value_t::object:
+            {
+                for (auto& element : *m_value.object)
+                {
+                    element.second.m_parent = this;
+                }
+                break;
+            }
+
+            case value_t::null:
+            case value_t::string:
+            case value_t::boolean:
+            case value_t::number_integer:
+            case value_t::number_unsigned:
+            case value_t::number_float:
+            case value_t::binary:
+            case value_t::discarded:
+            default:
+                break;
+        }
+#endif
+    }
+
+    iterator set_parents(iterator it, typename iterator::difference_type count_set_parents)
+    {
+#if JSON_DIAGNOSTICS
+        for (typename iterator::difference_type i = 0; i < count_set_parents; ++i)
+        {
+            (it + i)->m_parent = this;
+        }
+#else
+        static_cast<void>(count_set_parents);
+#endif
+        return it;
+    }
+
+    reference set_parent(reference j, std::size_t old_capacity = static_cast<std::size_t>(-1))
+    {
+#if JSON_DIAGNOSTICS
+        if (old_capacity != static_cast<std::size_t>(-1))
+        {
+            // see https://github.com/nlohmann/json/issues/2838
+            JSON_ASSERT(type() == value_t::array);
+            if (JSON_HEDLEY_UNLIKELY(m_value.array->capacity() != old_capacity))
+            {
+                // capacity has changed: update all parents
+                set_parents();
+                return j;
+            }
+        }
+
+        // ordered_json uses a vector internally, so pointers could have
+        // been invalidated; see https://github.com/nlohmann/json/issues/2962
+#ifdef JSON_HEDLEY_MSVC_VERSION
+#pragma warning(push)
+#pragma warning(disable : 4127) // ignore warning to replace if with if constexpr
+#endif
+        if (detail::is_ordered_map<object_t>::value)
+        {
+            set_parents();
+            return j;
+        }
+#ifdef JSON_HEDLEY_MSVC_VERSION
+#pragma warning(pop)
+#endif
+
+        j.m_parent = this;
+#else
+        static_cast<void>(j);
+        static_cast<void>(old_capacity);
+#endif
+        return j;
+    }
+
+  public:
+    //////////////////////////
+    // JSON parser callback //
+    //////////////////////////
+
+    /// @brief parser event types
+    /// @sa https://json.nlohmann.me/api/basic_json/parse_event_t/
+    using parse_event_t = detail::parse_event_t;
+
+    /// @brief per-element parser callback type
+    /// @sa https://json.nlohmann.me/api/basic_json/parser_callback_t/
+    using parser_callback_t = detail::parser_callback_t<basic_json>;
+
+    //////////////////
+    // constructors //
+    //////////////////
+
+    /// @name constructors and destructors
+    /// Constructors of class @ref basic_json, copy/move constructor, copy
+    /// assignment, static functions creating objects, and the destructor.
+    /// @{
+
+    /// @brief create an empty value with a given type
+    /// @sa https://json.nlohmann.me/api/basic_json/basic_json/
+    basic_json(const value_t v)
+        : m_type(v), m_value(v)
+    {
+        assert_invariant();
+    }
+
+    /// @brief create a null object
+    /// @sa https://json.nlohmann.me/api/basic_json/basic_json/
+    basic_json(std::nullptr_t = nullptr) noexcept
+        : basic_json(value_t::null)
+    {
+        assert_invariant();
+    }
+
+    /// @brief create a JSON value from compatible types
+    /// @sa https://json.nlohmann.me/api/basic_json/basic_json/
+    template < typename CompatibleType,
+               typename U = detail::uncvref_t<CompatibleType>,
+               detail::enable_if_t <
+                   !detail::is_basic_json<U>::value && detail::is_compatible_type<basic_json_t, U>::value, int > = 0 >
+    basic_json(CompatibleType && val) noexcept(noexcept( // NOLINT(bugprone-forwarding-reference-overload,bugprone-exception-escape)
+                JSONSerializer<U>::to_json(std::declval<basic_json_t&>(),
+                                           std::forward<CompatibleType>(val))))
+    {
+        JSONSerializer<U>::to_json(*this, std::forward<CompatibleType>(val));
+        set_parents();
+        assert_invariant();
+    }
+
+    /// @brief create a JSON value from an existing one
+    /// @sa https://json.nlohmann.me/api/basic_json/basic_json/
+    template < typename BasicJsonType,
+               detail::enable_if_t <
+                   detail::is_basic_json<BasicJsonType>::value&& !std::is_same<basic_json, BasicJsonType>::value, int > = 0 >
+    basic_json(const BasicJsonType& val)
+    {
+        using other_boolean_t = typename BasicJsonType::boolean_t;
+        using other_number_float_t = typename BasicJsonType::number_float_t;
+        using other_number_integer_t = typename BasicJsonType::number_integer_t;
+        using other_number_unsigned_t = typename BasicJsonType::number_unsigned_t;
+        using other_string_t = typename BasicJsonType::string_t;
+        using other_object_t = typename BasicJsonType::object_t;
+        using other_array_t = typename BasicJsonType::array_t;
+        using other_binary_t = typename BasicJsonType::binary_t;
+
+        switch (val.type())
+        {
+            case value_t::boolean:
+                JSONSerializer<other_boolean_t>::to_json(*this, val.template get<other_boolean_t>());
+                break;
+            case value_t::number_float:
+                JSONSerializer<other_number_float_t>::to_json(*this, val.template get<other_number_float_t>());
+                break;
+            case value_t::number_integer:
+                JSONSerializer<other_number_integer_t>::to_json(*this, val.template get<other_number_integer_t>());
+                break;
+            case value_t::number_unsigned:
+                JSONSerializer<other_number_unsigned_t>::to_json(*this, val.template get<other_number_unsigned_t>());
+                break;
+            case value_t::string:
+                JSONSerializer<other_string_t>::to_json(*this, val.template get_ref<const other_string_t&>());
+                break;
+            case value_t::object:
+                JSONSerializer<other_object_t>::to_json(*this, val.template get_ref<const other_object_t&>());
+                break;
+            case value_t::array:
+                JSONSerializer<other_array_t>::to_json(*this, val.template get_ref<const other_array_t&>());
+                break;
+            case value_t::binary:
+                JSONSerializer<other_binary_t>::to_json(*this, val.template get_ref<const other_binary_t&>());
+                break;
+            case value_t::null:
+                *this = nullptr;
+                break;
+            case value_t::discarded:
+                m_type = value_t::discarded;
+                break;
+            default:            // LCOV_EXCL_LINE
+                JSON_ASSERT(false); // NOLINT(cert-dcl03-c,hicpp-static-assert,misc-static-assert) LCOV_EXCL_LINE
+        }
+        set_parents();
+        assert_invariant();
+    }
+
+    /// @brief create a container (array or object) from an initializer list
+    /// @sa https://json.nlohmann.me/api/basic_json/basic_json/
+    basic_json(initializer_list_t init,
+               bool type_deduction = true,
+               value_t manual_type = value_t::array)
+    {
+        // check if each element is an array with two elements whose first
+        // element is a string
+        bool is_an_object = std::all_of(init.begin(), init.end(),
+                                        [](const detail::json_ref<basic_json>& element_ref)
+        {
+            return element_ref->is_array() && element_ref->size() == 2 && (*element_ref)[0].is_string();
+        });
+
+        // adjust type if type deduction is not wanted
+        if (!type_deduction)
+        {
+            // if an array is wanted, do not create an object even though it would be possible
+            if (manual_type == value_t::array)
+            {
+                is_an_object = false;
+            }
+
+            // if object is wanted but impossible, throw an exception
+            if (JSON_HEDLEY_UNLIKELY(manual_type == value_t::object && !is_an_object))
+            {
+                JSON_THROW(type_error::create(301, "cannot create object from initializer list", basic_json()));
+            }
+        }
+
+        if (is_an_object)
+        {
+            // the initializer list is a list of pairs -> create object
+            m_type = value_t::object;
+            m_value = value_t::object;
+
+            for (auto& element_ref : init)
+            {
+                auto element = element_ref.moved_or_copied();
+                m_value.object->emplace(
+                    std::move(*((*element.m_value.array)[0].m_value.string)),
+                    std::move((*element.m_value.array)[1]));
+            }
+        }
+        else
+        {
+            // the initializer list describes an array -> create array
+            m_type = value_t::array;
+            m_value.array = create<array_t>(init.begin(), init.end());
+        }
+
+        set_parents();
+        assert_invariant();
+    }
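+
+    // Illustrative sketch of the type deduction above:
+    //
+    //     basic_json j1 = {{"one", 1}, {"two", 2}}; // every element is a
+    //                                               // [string, value] pair -> object
+    //     basic_json j2 = {1, 2, 3};                // -> array
+    //     basic_json j3 = {{"one", 1}, 13};         // mixed -> array [["one", 1], 13]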
+
+    /// @brief explicitly create a binary array (without subtype)
+    /// @sa https://json.nlohmann.me/api/basic_json/binary/
+    JSON_HEDLEY_WARN_UNUSED_RESULT
+    static basic_json binary(const typename binary_t::container_type& init)
+    {
+        auto res = basic_json();
+        res.m_type = value_t::binary;
+        res.m_value = init;
+        return res;
+    }
+
+    /// @brief explicitly create a binary array (with subtype)
+    /// @sa https://json.nlohmann.me/api/basic_json/binary/
+    JSON_HEDLEY_WARN_UNUSED_RESULT
+    static basic_json binary(const typename binary_t::container_type& init, typename binary_t::subtype_type subtype)
+    {
+        auto res = basic_json();
+        res.m_type = value_t::binary;
+        res.m_value = binary_t(init, subtype);
+        return res;
+    }
+
+    /// @brief explicitly create a binary array
+    /// @sa https://json.nlohmann.me/api/basic_json/binary/
+    JSON_HEDLEY_WARN_UNUSED_RESULT
+    static basic_json binary(typename binary_t::container_type&& init)
+    {
+        auto res = basic_json();
+        res.m_type = value_t::binary;
+        res.m_value = std::move(init);
+        return res;
+    }
+
+    /// @brief explicitly create a binary array (with subtype)
+    /// @sa https://json.nlohmann.me/api/basic_json/binary/
+    JSON_HEDLEY_WARN_UNUSED_RESULT
+    static basic_json binary(typename binary_t::container_type&& init, typename binary_t::subtype_type subtype)
+    {
+        auto res = basic_json();
+        res.m_type = value_t::binary;
+        res.m_value = binary_t(std::move(init), subtype);
+        return res;
+    }
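+
+    // Illustrative sketch: binary values hold raw bytes (optionally tagged
+    // with a subtype) and are primarily intended for binary formats such as
+    // CBOR or MessagePack; they have no counterpart in JSON text.
+    //
+    //     auto b1 = basic_json::binary({0xCA, 0xFE});     // without subtype
+    //     auto b2 = basic_json::binary({0xCA, 0xFE}, 42); // with subtype 42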
+
+    /// @brief explicitly create an array from an initializer list
+    /// @sa https://json.nlohmann.me/api/basic_json/array/
+    JSON_HEDLEY_WARN_UNUSED_RESULT
+    static basic_json array(initializer_list_t init = {})
+    {
+        return basic_json(init, false, value_t::array);
+    }
+
+    /// @brief explicitly create an object from an initializer list
+    /// @sa https://json.nlohmann.me/api/basic_json/object/
+    JSON_HEDLEY_WARN_UNUSED_RESULT
+    static basic_json object(initializer_list_t init = {})
+    {
+        return basic_json(init, false, value_t::object);
+    }
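+
+    // Illustrative sketch: these factories bypass the brace deduction of the
+    // initializer-list constructor, which matters when the same braces could
+    // be read either way:
+    //
+    //     auto a = basic_json::array({{"one", 1}, {"two", 2}});  // array of two arrays
+    //     auto o = basic_json::object({{"one", 1}, {"two", 2}}); // object with two keys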
+
+    /// @brief construct an array with count copies of given value
+    /// @sa https://json.nlohmann.me/api/basic_json/basic_json/
+    basic_json(size_type cnt, const basic_json& val)
+        : m_type(value_t::array)
+    {
+        m_value.array = create<array_t>(cnt, val);
+        set_parents();
+        assert_invariant();
+    }
+
+    /// @brief construct a JSON container given an iterator range
+    /// @sa https://json.nlohmann.me/api/basic_json/basic_json/
+    template < class InputIT, typename std::enable_if <
+                   std::is_same<InputIT, typename basic_json_t::iterator>::value ||
+                   std::is_same<InputIT, typename basic_json_t::const_iterator>::value, int >::type = 0 >
+    basic_json(InputIT first, InputIT last)
+    {
+        JSON_ASSERT(first.m_object != nullptr);
+        JSON_ASSERT(last.m_object != nullptr);
+
+        // make sure iterator fits the current value
+        if (JSON_HEDLEY_UNLIKELY(first.m_object != last.m_object))
+        {
+            JSON_THROW(invalid_iterator::create(201, "iterators are not compatible", basic_json()));
+        }
+
+        // copy type from first iterator
+        m_type = first.m_object->m_type;
+
+        // check if iterator range is complete for primitive values
+        switch (m_type)
+        {
+            case value_t::boolean:
+            case value_t::number_float:
+            case value_t::number_integer:
+            case value_t::number_unsigned:
+            case value_t::string:
+            {
+                if (JSON_HEDLEY_UNLIKELY(!first.m_it.primitive_iterator.is_begin()
+                                         || !last.m_it.primitive_iterator.is_end()))
+                {
+                    JSON_THROW(invalid_iterator::create(204, "iterators out of range", *first.m_object));
+                }
+                break;
+            }
+
+            case value_t::null:
+            case value_t::object:
+            case value_t::array:
+            case value_t::binary:
+            case value_t::discarded:
+            default:
+                break;
+        }
+
+        switch (m_type)
+        {
+            case value_t::number_integer:
+            {
+                m_value.number_integer = first.m_object->m_value.number_integer;
+                break;
+            }
+
+            case value_t::number_unsigned:
+            {
+                m_value.number_unsigned = first.m_object->m_value.number_unsigned;
+                break;
+            }
+
+            case value_t::number_float:
+            {
+                m_value.number_float = first.m_object->m_value.number_float;
+                break;
+            }
+
+            case value_t::boolean:
+            {
+                m_value.boolean = first.m_object->m_value.boolean;
+                break;
+            }
+
+            case value_t::string:
+            {
+                m_value = *first.m_object->m_value.string;
+                break;
+            }
+
+            case value_t::object:
+            {
+                m_value.object = create<object_t>(first.m_it.object_iterator,
+                                                  last.m_it.object_iterator);
+                break;
+            }
+
+            case value_t::array:
+            {
+                m_value.array = create<array_t>(first.m_it.array_iterator,
+                                                last.m_it.array_iterator);
+                break;
+            }
+
+            case value_t::binary:
+            {
+                m_value = *first.m_object->m_value.binary;
+                break;
+            }
+
+            case value_t::null:
+            case value_t::discarded:
+            default:
+                JSON_THROW(invalid_iterator::create(206, "cannot construct with iterators from " + std::string(first.m_object->type_name()), *first.m_object));
+        }
+
+        set_parents();
+        assert_invariant();
+    }
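+
+    // Illustrative sketch: constructing from a subrange of a structured
+    // value copies just that slice (iterators are random access for arrays):
+    //
+    //     basic_json arr = {1, 2, 3, 4};
+    //     basic_json slice(arr.begin() + 1, arr.begin() + 3); // [2, 3]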
+
+
+    ///////////////////////////////////////
+    // other constructors and destructor //
+    ///////////////////////////////////////
+
+    template<typename JsonRef,
+             detail::enable_if_t<detail::conjunction<detail::is_json_ref<JsonRef>,
+                                 std::is_same<typename JsonRef::value_type, basic_json>>::value, int> = 0 >
+    basic_json(const JsonRef& ref) : basic_json(ref.moved_or_copied()) {}
+
+    /// @brief copy constructor
+    /// @sa https://json.nlohmann.me/api/basic_json/basic_json/
+    basic_json(const basic_json& other)
+        : m_type(other.m_type)
+    {
+        // check that passed value is valid
+        other.assert_invariant();
+
+        switch (m_type)
+        {
+            case value_t::object:
+            {
+                m_value = *other.m_value.object;
+                break;
+            }
+
+            case value_t::array:
+            {
+                m_value = *other.m_value.array;
+                break;
+            }
+
+            case value_t::string:
+            {
+                m_value = *other.m_value.string;
+                break;
+            }
+
+            case value_t::boolean:
+            {
+                m_value = other.m_value.boolean;
+                break;
+            }
+
+            case value_t::number_integer:
+            {
+                m_value = other.m_value.number_integer;
+                break;
+            }
+
+            case value_t::number_unsigned:
+            {
+                m_value = other.m_value.number_unsigned;
+                break;
+            }
+
+            case value_t::number_float:
+            {
+                m_value = other.m_value.number_float;
+                break;
+            }
+
+            case value_t::binary:
+            {
+                m_value = *other.m_value.binary;
+                break;
+            }
+
+            case value_t::null:
+            case value_t::discarded:
+            default:
+                break;
+        }
+
+        set_parents();
+        assert_invariant();
+    }
+
+    /// @brief move constructor
+    /// @sa https://json.nlohmann.me/api/basic_json/basic_json/
+    basic_json(basic_json&& other) noexcept
+        : m_type(std::move(other.m_type)),
+          m_value(std::move(other.m_value))
+    {
+        // check that passed value is valid
+        other.assert_invariant(false);
+
+        // invalidate payload
+        other.m_type = value_t::null;
+        other.m_value = {};
+
+        set_parents();
+        assert_invariant();
+    }
+
+    /// @brief copy assignment
+    /// @sa https://json.nlohmann.me/api/basic_json/operator=/
+    basic_json& operator=(basic_json other) noexcept (
+        std::is_nothrow_move_constructible<value_t>::value&&
+        std::is_nothrow_move_assignable<value_t>::value&&
+        std::is_nothrow_move_constructible<json_value>::value&&
+        std::is_nothrow_move_assignable<json_value>::value
+    )
+    {
+        // check that passed value is valid
+        other.assert_invariant();
+
+        using std::swap;
+        swap(m_type, other.m_type);
+        swap(m_value, other.m_value);
+
+        set_parents();
+        assert_invariant();
+        return *this;
+    }
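+
+    // Note: the unified assignment operator above takes its argument by
+    // value and swaps (the copy-and-swap idiom): the copy or move happens
+    // at the call site, and the member swaps themselves cannot throw.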
+
+    /// @brief destructor
+    /// @sa https://json.nlohmann.me/api/basic_json/~basic_json/
+    ~basic_json() noexcept
+    {
+        assert_invariant(false);
+        m_value.destroy(m_type);
+    }
+
+    /// @}
+
+  public:
+    ///////////////////////
+    // object inspection //
+    ///////////////////////
+
+    /// @name object inspection
+    /// Functions to inspect the type of a JSON value.
+    /// @{
+
+    /// @brief serialization
+    /// @sa https://json.nlohmann.me/api/basic_json/dump/
+    string_t dump(const int indent = -1,
+                  const char indent_char = ' ',
+                  const bool ensure_ascii = false,
+                  const error_handler_t error_handler = error_handler_t::strict) const
+    {
+        string_t result;
+        serializer s(detail::output_adapter<char, string_t>(result), indent_char, error_handler);
+
+        if (indent >= 0)
+        {
+            s.dump(*this, true, ensure_ascii, static_cast<unsigned int>(indent));
+        }
+        else
+        {
+            s.dump(*this, false, ensure_ascii, 0);
+        }
+
+        return result;
+    }
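+
+    // Illustrative sketch:
+    //
+    //     basic_json j = {{"pi", 3.141}};
+    //     j.dump();        // {"pi":3.141}   (compact; indent = -1)
+    //     j.dump(4);       // pretty-printed, 4 spaces per level
+    //     j.dump(1, '\t'); // pretty-printed, one tab per level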
+
+    /// @brief return the type of the JSON value (explicit)
+    /// @sa https://json.nlohmann.me/api/basic_json/type/
+    constexpr value_t type() const noexcept
+    {
+        return m_type;
+    }
+
+    /// @brief return whether type is primitive
+    /// @sa https://json.nlohmann.me/api/basic_json/is_primitive/
+    constexpr bool is_primitive() const noexcept
+    {
+        return is_null() || is_string() || is_boolean() || is_number() || is_binary();
+    }
+
+    /// @brief return whether type is structured
+    /// @sa https://json.nlohmann.me/api/basic_json/is_structured/
+    constexpr bool is_structured() const noexcept
+    {
+        return is_array() || is_object();
+    }
+
+    /// @brief return whether value is null
+    /// @sa https://json.nlohmann.me/api/basic_json/is_null/
+    constexpr bool is_null() const noexcept
+    {
+        return m_type == value_t::null;
+    }
+
+    /// @brief return whether value is a boolean
+    /// @sa https://json.nlohmann.me/api/basic_json/is_boolean/
+    constexpr bool is_boolean() const noexcept
+    {
+        return m_type == value_t::boolean;
+    }
+
+    /// @brief return whether value is a number
+    /// @sa https://json.nlohmann.me/api/basic_json/is_number/
+    constexpr bool is_number() const noexcept
+    {
+        return is_number_integer() || is_number_float();
+    }
+
+    /// @brief return whether value is an integer number
+    /// @sa https://json.nlohmann.me/api/basic_json/is_number_integer/
+    constexpr bool is_number_integer() const noexcept
+    {
+        return m_type == value_t::number_integer || m_type == value_t::number_unsigned;
+    }
+
+    /// @brief return whether value is an unsigned integer number
+    /// @sa https://json.nlohmann.me/api/basic_json/is_number_unsigned/
+    constexpr bool is_number_unsigned() const noexcept
+    {
+        return m_type == value_t::number_unsigned;
+    }
+
+    /// @brief return whether value is a floating-point number
+    /// @sa https://json.nlohmann.me/api/basic_json/is_number_float/
+    constexpr bool is_number_float() const noexcept
+    {
+        return m_type == value_t::number_float;
+    }
+
+    /// @brief return whether value is an object
+    /// @sa https://json.nlohmann.me/api/basic_json/is_object/
+    constexpr bool is_object() const noexcept
+    {
+        return m_type == value_t::object;
+    }
+
+    /// @brief return whether value is an array
+    /// @sa https://json.nlohmann.me/api/basic_json/is_array/
+    constexpr bool is_array() const noexcept
+    {
+        return m_type == value_t::array;
+    }
+
+    /// @brief return whether value is a string
+    /// @sa https://json.nlohmann.me/api/basic_json/is_string/
+    constexpr bool is_string() const noexcept
+    {
+        return m_type == value_t::string;
+    }
+
+    /// @brief return whether value is a binary array
+    /// @sa https://json.nlohmann.me/api/basic_json/is_binary/
+    constexpr bool is_binary() const noexcept
+    {
+        return m_type == value_t::binary;
+    }
+
+    /// @brief return whether value is discarded
+    /// @sa https://json.nlohmann.me/api/basic_json/is_discarded/
+    constexpr bool is_discarded() const noexcept
+    {
+        return m_type == value_t::discarded;
+    }
+
+    /// @brief return the type of the JSON value (implicit)
+    /// @sa https://json.nlohmann.me/api/basic_json/operator_value_t/
+    constexpr operator value_t() const noexcept
+    {
+        return m_type;
+    }
+
+    /// @}
+
+  private:
+    //////////////////
+    // value access //
+    //////////////////
+
+    /// get a boolean (explicit)
+    boolean_t get_impl(boolean_t* /*unused*/) const
+    {
+        if (JSON_HEDLEY_LIKELY(is_boolean()))
+        {
+            return m_value.boolean;
+        }
+
+        JSON_THROW(type_error::create(302, "type must be boolean, but is " + std::string(type_name()), *this));
+    }
+
+    /// get a pointer to the value (object)
+    object_t* get_impl_ptr(object_t* /*unused*/) noexcept
+    {
+        return is_object() ? m_value.object : nullptr;
+    }
+
+    /// get a pointer to the value (object)
+    constexpr const object_t* get_impl_ptr(const object_t* /*unused*/) const noexcept
+    {
+        return is_object() ? m_value.object : nullptr;
+    }
+
+    /// get a pointer to the value (array)
+    array_t* get_impl_ptr(array_t* /*unused*/) noexcept
+    {
+        return is_array() ? m_value.array : nullptr;
+    }
+
+    /// get a pointer to the value (array)
+    constexpr const array_t* get_impl_ptr(const array_t* /*unused*/) const noexcept
+    {
+        return is_array() ? m_value.array : nullptr;
+    }
+
+    /// get a pointer to the value (string)
+    string_t* get_impl_ptr(string_t* /*unused*/) noexcept
+    {
+        return is_string() ? m_value.string : nullptr;
+    }
+
+    /// get a pointer to the value (string)
+    constexpr const string_t* get_impl_ptr(const string_t* /*unused*/) const noexcept
+    {
+        return is_string() ? m_value.string : nullptr;
+    }
+
+    /// get a pointer to the value (boolean)
+    boolean_t* get_impl_ptr(boolean_t* /*unused*/) noexcept
+    {
+        return is_boolean() ? &m_value.boolean : nullptr;
+    }
+
+    /// get a pointer to the value (boolean)
+    constexpr const boolean_t* get_impl_ptr(const boolean_t* /*unused*/) const noexcept
+    {
+        return is_boolean() ? &m_value.boolean : nullptr;
+    }
+
+    /// get a pointer to the value (integer number)
+    number_integer_t* get_impl_ptr(number_integer_t* /*unused*/) noexcept
+    {
+        return is_number_integer() ? &m_value.number_integer : nullptr;
+    }
+
+    /// get a pointer to the value (integer number)
+    constexpr const number_integer_t* get_impl_ptr(const number_integer_t* /*unused*/) const noexcept
+    {
+        return is_number_integer() ? &m_value.number_integer : nullptr;
+    }
+
+    /// get a pointer to the value (unsigned number)
+    number_unsigned_t* get_impl_ptr(number_unsigned_t* /*unused*/) noexcept
+    {
+        return is_number_unsigned() ? &m_value.number_unsigned : nullptr;
+    }
+
+    /// get a pointer to the value (unsigned number)
+    constexpr const number_unsigned_t* get_impl_ptr(const number_unsigned_t* /*unused*/) const noexcept
+    {
+        return is_number_unsigned() ? &m_value.number_unsigned : nullptr;
+    }
+
+    /// get a pointer to the value (floating-point number)
+    number_float_t* get_impl_ptr(number_float_t* /*unused*/) noexcept
+    {
+        return is_number_float() ? &m_value.number_float : nullptr;
+    }
+
+    /// get a pointer to the value (floating-point number)
+    constexpr const number_float_t* get_impl_ptr(const number_float_t* /*unused*/) const noexcept
+    {
+        return is_number_float() ? &m_value.number_float : nullptr;
+    }
+
+    /// get a pointer to the value (binary)
+    binary_t* get_impl_ptr(binary_t* /*unused*/) noexcept
+    {
+        return is_binary() ? m_value.binary : nullptr;
+    }
+
+    /// get a pointer to the value (binary)
+    constexpr const binary_t* get_impl_ptr(const binary_t* /*unused*/) const noexcept
+    {
+        return is_binary() ? m_value.binary : nullptr;
+    }
+
+    /*!
+    @brief helper function to implement get_ref()
+
+    This function helps to implement get_ref() without code duplication for
+    const and non-const overloads
+
+    @tparam ThisType will be deduced as `basic_json` or `const basic_json`
+
+    @throw type_error.303 if ReferenceType does not match underlying value
+    type of the current JSON
+    */
+    template<typename ReferenceType, typename ThisType>
+    static ReferenceType get_ref_impl(ThisType& obj)
+    {
+        // delegate the call to get_ptr<>()
+        auto* ptr = obj.template get_ptr<typename std::add_pointer<ReferenceType>::type>();
+
+        if (JSON_HEDLEY_LIKELY(ptr != nullptr))
+        {
+            return *ptr;
+        }
+
+        JSON_THROW(type_error::create(303, "incompatible ReferenceType for get_ref, actual type is " + std::string(obj.type_name()), obj));
+    }
+
+  public:
+    /// @name value access
+    /// Direct access to the stored value of a JSON value.
+    /// @{
+
+    /// @brief get a pointer value (implicit)
+    /// @sa https://json.nlohmann.me/api/basic_json/get_ptr/
+    template<typename PointerType, typename std::enable_if<
+                 std::is_pointer<PointerType>::value, int>::type = 0>
+    auto get_ptr() noexcept -> decltype(std::declval<basic_json_t&>().get_impl_ptr(std::declval<PointerType>()))
+    {
+        // delegate the call to get_impl_ptr<>()
+        return get_impl_ptr(static_cast<PointerType>(nullptr));
+    }
+
+    /// @brief get a pointer value (implicit)
+    /// @sa https://json.nlohmann.me/api/basic_json/get_ptr/
+    template < typename PointerType, typename std::enable_if <
+                   std::is_pointer<PointerType>::value&&
+                   std::is_const<typename std::remove_pointer<PointerType>::type>::value, int >::type = 0 >
+    constexpr auto get_ptr() const noexcept -> decltype(std::declval<const basic_json_t&>().get_impl_ptr(std::declval<PointerType>()))
+    {
+        // delegate the call to get_impl_ptr<>() const
+        return get_impl_ptr(static_cast<PointerType>(nullptr));
+    }
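+
+    // Usage sketch for get_ptr() (a non-normative example, assuming the
+    // default `json` alias; the pointer is invalidated when the underlying
+    // JSON value changes):
+    //
+    //   json j = "hello";
+    //   if (auto* s = j.get_ptr<json::string_t*>())
+    //   {
+    //       *s += " world";  // mutate the stored string in place
+    //   }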
+
+  private:
+    /*!
+    @brief get a value (explicit)
+
+    Explicit type conversion between the JSON value and a compatible value
+    which is [CopyConstructible](https://en.cppreference.com/w/cpp/named_req/CopyConstructible)
+    and [DefaultConstructible](https://en.cppreference.com/w/cpp/named_req/DefaultConstructible).
+    The value is converted by calling the @ref json_serializer<ValueType>
+    `from_json()` method.
+
+    The function is equivalent to executing
+    @code {.cpp}
+    ValueType ret;
+    JSONSerializer<ValueType>::from_json(*this, ret);
+    return ret;
+    @endcode
+
+    This overload is chosen if:
+    - @a ValueType is not @ref basic_json,
+    - @ref json_serializer<ValueType> has a `from_json()` method of the form
+      `void from_json(const basic_json&, ValueType&)`, and
+    - @ref json_serializer<ValueType> does not have a `from_json()` method of
+      the form `ValueType from_json(const basic_json&)`
+
+    @tparam ValueType the returned value type
+
+    @return copy of the JSON value, converted to @a ValueType
+
+    @throw what @ref json_serializer<ValueType> `from_json()` method throws
+
+    @liveexample{The example below shows several conversions from JSON values
+    to other types. There are a few things to note: (1) Floating-point numbers can
+    be converted to integers\, (2) A JSON array can be converted to a standard
+    `std::vector<short>`\, (3) A JSON object can be converted to C++
+    associative containers such as `std::unordered_map<std::string\,
+    json>`.,get__ValueType_const}
+
+    @since version 2.1.0
+    */
+    template < typename ValueType,
+               detail::enable_if_t <
+                   detail::is_default_constructible<ValueType>::value&&
+                   detail::has_from_json<basic_json_t, ValueType>::value,
+                   int > = 0 >
+    ValueType get_impl(detail::priority_tag<0> /*unused*/) const noexcept(noexcept(
+                JSONSerializer<ValueType>::from_json(std::declval<const basic_json_t&>(), std::declval<ValueType&>())))
+    {
+        auto ret = ValueType();
+        JSONSerializer<ValueType>::from_json(*this, ret);
+        return ret;
+    }
+
+    /*!
+    @brief get a value (explicit); special case
+
+    Explicit type conversion between the JSON value and a compatible value
+    which is **not** [CopyConstructible](https://en.cppreference.com/w/cpp/named_req/CopyConstructible)
+    and **not** [DefaultConstructible](https://en.cppreference.com/w/cpp/named_req/DefaultConstructible).
+    The value is converted by calling the @ref json_serializer<ValueType>
+    `from_json()` method.
+
+    The function is equivalent to executing
+    @code {.cpp}
+    return JSONSerializer<ValueType>::from_json(*this);
+    @endcode
+
+    This overload is chosen if:
+    - @a ValueType is not @ref basic_json and
+    - @ref json_serializer<ValueType> has a `from_json()` method of the form
+      `ValueType from_json(const basic_json&)`
+
+    @note If @ref json_serializer<ValueType> has both overloads of
+    `from_json()`, this one is chosen.
+
+    @tparam ValueType the returned value type
+
+    @return copy of the JSON value, converted to @a ValueType
+
+    @throw what @ref json_serializer<ValueType> `from_json()` method throws
+
+    @since version 2.1.0
+    */
+    template < typename ValueType,
+               detail::enable_if_t <
+                   detail::has_non_default_from_json<basic_json_t, ValueType>::value,
+                   int > = 0 >
+    ValueType get_impl(detail::priority_tag<1> /*unused*/) const noexcept(noexcept(
+                JSONSerializer<ValueType>::from_json(std::declval<const basic_json_t&>())))
+    {
+        return JSONSerializer<ValueType>::from_json(*this);
+    }
+
+    /*!
+    @brief get special-case overload
+
+    This overload converts the current @ref basic_json into a different
+    @ref basic_json type.
+
+    @tparam BasicJsonType == @ref basic_json
+
+    @return a copy of *this, converted into @a BasicJsonType
+
+    @complexity Depending on the implementation of the called `from_json()`
+                method.
+
+    @since version 3.2.0
+    */
+    template < typename BasicJsonType,
+               detail::enable_if_t <
+                   detail::is_basic_json<BasicJsonType>::value,
+                   int > = 0 >
+    BasicJsonType get_impl(detail::priority_tag<2> /*unused*/) const
+    {
+        return *this;
+    }
+
+    /*!
+    @brief get special-case overload
+
+    This overload avoids a lot of template boilerplate; it can be seen as the
+    identity method.
+
+    @tparam BasicJsonType == @ref basic_json
+
+    @return a copy of *this
+
+    @complexity Constant.
+
+    @since version 2.1.0
+    */
+    template<typename BasicJsonType,
+             detail::enable_if_t<
+                 std::is_same<BasicJsonType, basic_json_t>::value,
+                 int> = 0>
+    basic_json get_impl(detail::priority_tag<3> /*unused*/) const
+    {
+        return *this;
+    }
+
+    /*!
+    @brief get a pointer value (explicit)
+    @copydoc get()
+    */
+    template<typename PointerType,
+             detail::enable_if_t<
+                 std::is_pointer<PointerType>::value,
+                 int> = 0>
+    constexpr auto get_impl(detail::priority_tag<4> /*unused*/) const noexcept
+    -> decltype(std::declval<const basic_json_t&>().template get_ptr<PointerType>())
+    {
+        // delegate the call to get_ptr
+        return get_ptr<PointerType>();
+    }
+
+  public:
+    /*!
+    @brief get a (pointer) value (explicit)
+
+    Performs explicit type conversion between the JSON value and a compatible value if required.
+
+    - If the requested type is a pointer to the internally stored JSON value, that pointer is returned.
+    No copies are made.
+
+    - If the requested type is the current @ref basic_json, or a different @ref basic_json convertible
+    from the current @ref basic_json, a copy of the value is returned.
+
+    - Otherwise the value is converted by calling the @ref json_serializer<ValueType> `from_json()`
+    method.
+
+    @tparam ValueTypeCV the provided value type
+    @tparam ValueType the returned value type
+
+    @return copy of the JSON value, converted to @a ValueType if necessary
+
+    @throw what @ref json_serializer<ValueType> `from_json()` method throws if conversion is required
+
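+    A minimal usage sketch (a non-normative example, assuming the default
+    `json` alias):
+
+    @code {.cpp}
+    json j = {1, 2, 3};
+    auto v = j.get<std::vector<int>>(); // conversion via from_json()
+    @endcode
+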
+    @since version 2.1.0
+    */
+    template < typename ValueTypeCV, typename ValueType = detail::uncvref_t<ValueTypeCV>>
+#if defined(JSON_HAS_CPP_14)
+    constexpr
+#endif
+    auto get() const noexcept(
+    noexcept(std::declval<const basic_json_t&>().template get_impl<ValueType>(detail::priority_tag<4> {})))
+    -> decltype(std::declval<const basic_json_t&>().template get_impl<ValueType>(detail::priority_tag<4> {}))
+    {
+        // we cannot static_assert on ValueTypeCV being non-const, because
+        // there is support for get<const basic_json_t>(), which is why we
+        // still need the uncvref
+        static_assert(!std::is_reference<ValueTypeCV>::value,
+                      "get() cannot be used with reference types, you might want to use get_ref()");
+        return get_impl<ValueType>(detail::priority_tag<4> {});
+    }
+
+    /*!
+    @brief get a pointer value (explicit)
+
+    Explicit pointer access to the internally stored JSON value. No copies are
+    made.
+
+    @warning The pointer becomes invalid if the underlying JSON object
+    changes.
+
+    @tparam PointerType pointer type; must be a pointer to @ref array_t, @ref
+    object_t, @ref string_t, @ref boolean_t, @ref number_integer_t,
+    @ref number_unsigned_t, or @ref number_float_t.
+
+    @return pointer to the internally stored JSON value if the requested
+    pointer type @a PointerType fits to the JSON value; `nullptr` otherwise
+
+    @complexity Constant.
+
+    @liveexample{The example below shows how pointers to internal values of a
+    JSON value can be requested. Note that no type conversions are made and a
+    `nullptr` is returned if the value and the requested pointer type do not
+    match.,get__PointerType}
+
+    @sa see @ref get_ptr() for explicit pointer-member access
+
+    @since version 1.0.0
+    */
+    template<typename PointerType, typename std::enable_if<
+                 std::is_pointer<PointerType>::value, int>::type = 0>
+    auto get() noexcept -> decltype(std::declval<basic_json_t&>().template get_ptr<PointerType>())
+    {
+        // delegate the call to get_ptr
+        return get_ptr<PointerType>();
+    }
+
+    /// @brief get a value (explicit)
+    /// @sa https://json.nlohmann.me/api/basic_json/get_to/
+    template < typename ValueType,
+               detail::enable_if_t <
+                   !detail::is_basic_json<ValueType>::value&&
+                   detail::has_from_json<basic_json_t, ValueType>::value,
+                   int > = 0 >
+    ValueType & get_to(ValueType& v) const noexcept(noexcept(
+                JSONSerializer<ValueType>::from_json(std::declval<const basic_json_t&>(), v)))
+    {
+        JSONSerializer<ValueType>::from_json(*this, v);
+        return v;
+    }
+
+    // specialization to allow calling get_to with a basic_json value
+    // see https://github.com/nlohmann/json/issues/2175
+    template<typename ValueType,
+             detail::enable_if_t <
+                 detail::is_basic_json<ValueType>::value,
+                 int> = 0>
+    ValueType & get_to(ValueType& v) const
+    {
+        v = *this;
+        return v;
+    }
+
+    template <
+        typename T, std::size_t N,
+        typename Array = T (&)[N], // NOLINT(cppcoreguidelines-avoid-c-arrays,hicpp-avoid-c-arrays,modernize-avoid-c-arrays)
+        detail::enable_if_t <
+            detail::has_from_json<basic_json_t, Array>::value, int > = 0 >
+    Array get_to(T (&v)[N]) const // NOLINT(cppcoreguidelines-avoid-c-arrays,hicpp-avoid-c-arrays,modernize-avoid-c-arrays)
+    noexcept(noexcept(JSONSerializer<Array>::from_json(
+                          std::declval<const basic_json_t&>(), v)))
+    {
+        JSONSerializer<Array>::from_json(*this, v);
+        return v;
+    }
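+
+    // Usage sketch for get_to() (a non-normative example, assuming the
+    // default `json` alias):
+    //
+    //   json j = {1, 2, 3};
+    //   std::vector<int> v;
+    //   j.get_to(v);  // fills v in place and returns a reference to it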
+
+    /// @brief get a reference value (implicit)
+    /// @sa https://json.nlohmann.me/api/basic_json/get_ref/
+    template<typename ReferenceType, typename std::enable_if<
+                 std::is_reference<ReferenceType>::value, int>::type = 0>
+    ReferenceType get_ref()
+    {
+        // delegate call to get_ref_impl
+        return get_ref_impl<ReferenceType>(*this);
+    }
+
+    /// @brief get a reference value (implicit)
+    /// @sa https://json.nlohmann.me/api/basic_json/get_ref/
+    template < typename ReferenceType, typename std::enable_if <
+                   std::is_reference<ReferenceType>::value&&
+                   std::is_const<typename std::remove_reference<ReferenceType>::type>::value, int >::type = 0 >
+    ReferenceType get_ref() const
+    {
+        // delegate call to get_ref_impl
+        return get_ref_impl<ReferenceType>(*this);
+    }
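+
+    // Usage sketch for get_ref() (a non-normative example, assuming the
+    // default `json` alias; throws type_error.303 on a type mismatch):
+    //
+    //   json j = "cache";
+    //   auto& s = j.get_ref<json::string_t&>();
+    //   s += "d";  // j now holds "cached"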
+
+    /*!
+    @brief get a value (implicit)
+
+    Implicit type conversion between the JSON value and a compatible value.
+    The call is realized by calling @ref get() const.
+
+    @tparam ValueType non-pointer type compatible with the JSON value, for
+    instance `int` for JSON integer numbers, `bool` for JSON booleans, or
+    `std::vector` types for JSON arrays. The character type of @ref string_t
+    as well as an initializer list of this type is excluded to avoid
+    ambiguities as these types implicitly convert to `std::string`.
+
+    @return copy of the JSON value, converted to type @a ValueType
+
+    @throw type_error.302 in case the passed type @a ValueType is incompatible
+    with the JSON value type (e.g., the JSON value is of type boolean, but a
+    string is requested); see example below
+
+    @complexity Linear in the size of the JSON value.
+
+    @liveexample{The example below shows several conversions from JSON values
+    to other types. There are a few things to note: (1) Floating-point numbers can
+    be converted to integers\, (2) A JSON array can be converted to a standard
+    `std::vector<short>`\, (3) A JSON object can be converted to C++
+    associative containers such as `std::unordered_map<std::string\,
+    json>`.,operator__ValueType}
+
+    @since version 1.0.0
+    */
+    template < typename ValueType, typename std::enable_if <
+                   detail::conjunction <
+                       detail::negation<std::is_pointer<ValueType>>,
+                       detail::negation<std::is_same<ValueType, detail::json_ref<basic_json>>>,
+                       detail::negation<std::is_same<ValueType, typename string_t::value_type>>,
+                       detail::negation<detail::is_basic_json<ValueType>>,
+                       detail::negation<std::is_same<ValueType, std::initializer_list<typename string_t::value_type>>>,
+#if defined(JSON_HAS_CPP_17) && (defined(__GNUC__) || (defined(_MSC_VER) && _MSC_VER >= 1910 && _MSC_VER <= 1914))
+                       detail::negation<std::is_same<ValueType, std::string_view>>,
+#endif
+                       detail::is_detected_lazy<detail::get_template_function, const basic_json_t&, ValueType>
+                   >::value, int >::type = 0 >
+    JSON_EXPLICIT operator ValueType() const
+    {
+        // delegate the call to get<>() const
+        return get<ValueType>();
+    }
+
+    /// @brief get a binary value
+    /// @sa https://json.nlohmann.me/api/basic_json/get_binary/
+    binary_t& get_binary()
+    {
+        if (!is_binary())
+        {
+            JSON_THROW(type_error::create(302, "type must be binary, but is " + std::string(type_name()), *this));
+        }
+
+        return *get_ptr<binary_t*>();
+    }
+
+    /// @brief get a binary value
+    /// @sa https://json.nlohmann.me/api/basic_json/get_binary/
+    const binary_t& get_binary() const
+    {
+        if (!is_binary())
+        {
+            JSON_THROW(type_error::create(302, "type must be binary, but is " + std::string(type_name()), *this));
+        }
+
+        return *get_ptr<const binary_t*>();
+    }
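+
+    // Usage sketch for get_binary() (a non-normative example, assuming the
+    // default `json` alias):
+    //
+    //   json j = json::binary({0xCA, 0xFE});
+    //   j.get_binary().push_back(0x42);  // append a byte in place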
+
+    /// @}
+
+
+    ////////////////////
+    // element access //
+    ////////////////////
+
+    /// @name element access
+    /// Access to the JSON value.
+    /// @{
+
+    /// @brief access specified array element with bounds checking
+    /// @sa https://json.nlohmann.me/api/basic_json/at/
+    reference at(size_type idx)
+    {
+        // at only works for arrays
+        if (JSON_HEDLEY_LIKELY(is_array()))
+        {
+            JSON_TRY
+            {
+                return set_parent(m_value.array->at(idx));
+            }
+            JSON_CATCH (std::out_of_range&)
+            {
+                // create better exception explanation
+                JSON_THROW(out_of_range::create(401, "array index " + std::to_string(idx) + " is out of range", *this));
+            }
+        }
+        else
+        {
+            JSON_THROW(type_error::create(304, "cannot use at() with " + std::string(type_name()), *this));
+        }
+    }
+
+    /// @brief access specified array element with bounds checking
+    /// @sa https://json.nlohmann.me/api/basic_json/at/
+    const_reference at(size_type idx) const
+    {
+        // at only works for arrays
+        if (JSON_HEDLEY_LIKELY(is_array()))
+        {
+            JSON_TRY
+            {
+                return m_value.array->at(idx);
+            }
+            JSON_CATCH (std::out_of_range&)
+            {
+                // create better exception explanation
+                JSON_THROW(out_of_range::create(401, "array index " + std::to_string(idx) + " is out of range", *this));
+            }
+        }
+        else
+        {
+            JSON_THROW(type_error::create(304, "cannot use at() with " + std::string(type_name()), *this));
+        }
+    }
+
+    /// @brief access specified object element with bounds checking
+    /// @sa https://json.nlohmann.me/api/basic_json/at/
+    reference at(const typename object_t::key_type& key)
+    {
+        // at only works for objects
+        if (JSON_HEDLEY_LIKELY(is_object()))
+        {
+            JSON_TRY
+            {
+                return set_parent(m_value.object->at(key));
+            }
+            JSON_CATCH (std::out_of_range&)
+            {
+                // create better exception explanation
+                JSON_THROW(out_of_range::create(403, "key '" + key + "' not found", *this));
+            }
+        }
+        else
+        {
+            JSON_THROW(type_error::create(304, "cannot use at() with " + std::string(type_name()), *this));
+        }
+    }
+
+    /// @brief access specified object element with bounds checking
+    /// @sa https://json.nlohmann.me/api/basic_json/at/
+    const_reference at(const typename object_t::key_type& key) const
+    {
+        // at only works for objects
+        if (JSON_HEDLEY_LIKELY(is_object()))
+        {
+            JSON_TRY
+            {
+                return m_value.object->at(key);
+            }
+            JSON_CATCH (std::out_of_range&)
+            {
+                // create better exception explanation
+                JSON_THROW(out_of_range::create(403, "key '" + key + "' not found", *this));
+            }
+        }
+        else
+        {
+            JSON_THROW(type_error::create(304, "cannot use at() with " + std::string(type_name()), *this));
+        }
+    }
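+
+    // Usage sketch for at() (a non-normative example, assuming the default
+    // `json` alias): unlike operator[], at() never inserts elements and
+    // throws on unknown keys or out-of-range indices.
+    //
+    //   json j = R"({"list": [1, 2, 3]})"_json;
+    //   j.at("list").at(1) = 20;  // ok
+    //   // j.at("missing");       // throws out_of_range.403
+    //   // j.at("list").at(9);    // throws out_of_range.401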
+
+    /// @brief access specified array element
+    /// @sa https://json.nlohmann.me/api/basic_json/operator%5B%5D/
+    reference operator[](size_type idx)
+    {
+        // implicitly convert null value to an empty array
+        if (is_null())
+        {
+            m_type = value_t::array;
+            m_value.array = create<array_t>();
+            assert_invariant();
+        }
+
+        // operator[] only works for arrays
+        if (JSON_HEDLEY_LIKELY(is_array()))
+        {
+            // fill up array with null values if given idx is outside range
+            if (idx >= m_value.array->size())
+            {
+#if JSON_DIAGNOSTICS
+                // remember array size & capacity before resizing
+                const auto old_size = m_value.array->size();
+                const auto old_capacity = m_value.array->capacity();
+#endif
+                m_value.array->resize(idx + 1);
+
+#if JSON_DIAGNOSTICS
+                if (JSON_HEDLEY_UNLIKELY(m_value.array->capacity() != old_capacity))
+                {
+                    // capacity has changed: update all parents
+                    set_parents();
+                }
+                else
+                {
+                    // set parent for values added above
+                    set_parents(begin() + static_cast<typename iterator::difference_type>(old_size), static_cast<typename iterator::difference_type>(idx + 1 - old_size));
+                }
+#endif
+                assert_invariant();
+            }
+
+            return m_value.array->operator[](idx);
+        }
+
+        JSON_THROW(type_error::create(305, "cannot use operator[] with a numeric argument with " + std::string(type_name()), *this));
+    }
+
+    /// @brief access specified array element
+    /// @sa https://json.nlohmann.me/api/basic_json/operator%5B%5D/
+    const_reference operator[](size_type idx) const
+    {
+        // const operator[] only works for arrays
+        if (JSON_HEDLEY_LIKELY(is_array()))
+        {
+            return m_value.array->operator[](idx);
+        }
+
+        JSON_THROW(type_error::create(305, "cannot use operator[] with a numeric argument with " + std::string(type_name()), *this));
+    }
+
+    /// @brief access specified object element
+    /// @sa https://json.nlohmann.me/api/basic_json/operator%5B%5D/
+    reference operator[](const typename object_t::key_type& key)
+    {
+        // implicitly convert null value to an empty object
+        if (is_null())
+        {
+            m_type = value_t::object;
+            m_value.object = create<object_t>();
+            assert_invariant();
+        }
+
+        // operator[] only works for objects
+        if (JSON_HEDLEY_LIKELY(is_object()))
+        {
+            return set_parent(m_value.object->operator[](key));
+        }
+
+        JSON_THROW(type_error::create(305, "cannot use operator[] with a string argument with " + std::string(type_name()), *this));
+    }
+
+    /// @brief access specified object element
+    /// @sa https://json.nlohmann.me/api/basic_json/operator%5B%5D/
+    const_reference operator[](const typename object_t::key_type& key) const
+    {
+        // const operator[] only works for objects
+        if (JSON_HEDLEY_LIKELY(is_object()))
+        {
+            JSON_ASSERT(m_value.object->find(key) != m_value.object->end());
+            return m_value.object->find(key)->second;
+        }
+
+        JSON_THROW(type_error::create(305, "cannot use operator[] with a string argument with " + std::string(type_name()), *this));
+    }
+
+    /// @brief access specified object element
+    /// @sa https://json.nlohmann.me/api/basic_json/operator%5B%5D/
+    template<typename T>
+    JSON_HEDLEY_NON_NULL(2)
+    reference operator[](T* key)
+    {
+        // implicitly convert null to object
+        if (is_null())
+        {
+            m_type = value_t::object;
+            m_value = value_t::object;
+            assert_invariant();
+        }
+
+        // operator[] only works for objects
+        if (JSON_HEDLEY_LIKELY(is_object()))
+        {
+            return set_parent(m_value.object->operator[](key));
+        }
+
+        JSON_THROW(type_error::create(305, "cannot use operator[] with a string argument with " + std::string(type_name()), *this));
+    }
+
+    /// @brief access specified object element
+    /// @sa https://json.nlohmann.me/api/basic_json/operator%5B%5D/
+    template<typename T>
+    JSON_HEDLEY_NON_NULL(2)
+    const_reference operator[](T* key) const
+    {
+        // const operator[] only works for objects
+        if (JSON_HEDLEY_LIKELY(is_object()))
+        {
+            JSON_ASSERT(m_value.object->find(key) != m_value.object->end());
+            return m_value.object->find(key)->second;
+        }
+
+        JSON_THROW(type_error::create(305, "cannot use operator[] with a string argument with " + std::string(type_name()), *this));
+    }
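+
+    // Usage sketch for operator[] (a non-normative example, assuming the
+    // default `json` alias): on a null value, an empty object or array is
+    // created implicitly first.
+    //
+    //   json j;            // null
+    //   j["answer"] = 42;  // j is now {"answer": 42}
+    //   json a;            // null
+    //   a[3] = true;       // a is now [null, null, null, true]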
+
+    /// @brief access specified object element with default value
+    /// @sa https://json.nlohmann.me/api/basic_json/value/
+    /// using std::is_convertible in a std::enable_if will fail when using explicit conversions
+    template < class ValueType, typename std::enable_if <
+                   detail::is_getable<basic_json_t, ValueType>::value
+                   && !std::is_same<value_t, ValueType>::value, int >::type = 0 >
+    ValueType value(const typename object_t::key_type& key, const ValueType& default_value) const
+    {
+        // value() only works for objects
+        if (JSON_HEDLEY_LIKELY(is_object()))
+        {
+            // if key is found, return its value; otherwise return the given default value
+            const auto it = find(key);
+            if (it != end())
+            {
+                return it->template get<ValueType>();
+            }
+
+            return default_value;
+        }
+
+        JSON_THROW(type_error::create(306, "cannot use value() with " + std::string(type_name()), *this));
+    }
+
+    /// @brief access specified object element with default value
+    /// @sa https://json.nlohmann.me/api/basic_json/value/
+    /// overload for a default value of type const char*
+    string_t value(const typename object_t::key_type& key, const char* default_value) const
+    {
+        return value(key, string_t(default_value));
+    }
+
+    /// @brief access specified object element via JSON Pointer with default value
+    /// @sa https://json.nlohmann.me/api/basic_json/value/
+    template<class ValueType, typename std::enable_if<
+                 detail::is_getable<basic_json_t, ValueType>::value, int>::type = 0>
+    ValueType value(const json_pointer& ptr, const ValueType& default_value) const
+    {
+        // value() only works for objects
+        if (JSON_HEDLEY_LIKELY(is_object()))
+        {
+            // if the pointer resolves to a value, return it; otherwise return the default value
+            JSON_TRY
+            {
+                return ptr.get_checked(this).template get<ValueType>();
+            }
+            JSON_INTERNAL_CATCH (out_of_range&)
+            {
+                return default_value;
+            }
+        }
+
+        JSON_THROW(type_error::create(306, "cannot use value() with " + std::string(type_name()), *this));
+    }
+
+    /// @brief access specified object element via JSON Pointer with default value
+    /// @sa https://json.nlohmann.me/api/basic_json/value/
+    /// overload for a default value of type const char*
+    JSON_HEDLEY_NON_NULL(3)
+    string_t value(const json_pointer& ptr, const char* default_value) const
+    {
+        return value(ptr, string_t(default_value));
+    }
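+
+    // Usage sketch for value() (a non-normative example, assuming the
+    // default `json` alias):
+    //
+    //   json j = R"({"port": 8080})"_json;
+    //   int port    = j.value("port", 80);     // 8080
+    //   int timeout = j.value("timeout", 30);  // 30, key is missing
+    //   auto host   = j.value("/net/host"_json_pointer, std::string("localhost"));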
+
+    /// @brief access the first element
+    /// @sa https://json.nlohmann.me/api/basic_json/front/
+    reference front()
+    {
+        return *begin();
+    }
+
+    /// @brief access the first element
+    /// @sa https://json.nlohmann.me/api/basic_json/front/
+    const_reference front() const
+    {
+        return *cbegin();
+    }
+
+    /// @brief access the last element
+    /// @sa https://json.nlohmann.me/api/basic_json/back/
+    reference back()
+    {
+        auto tmp = end();
+        --tmp;
+        return *tmp;
+    }
+
+    /// @brief access the last element
+    /// @sa https://json.nlohmann.me/api/basic_json/back/
+    const_reference back() const
+    {
+        auto tmp = cend();
+        --tmp;
+        return *tmp;
+    }
+
+    /// @brief remove element given an iterator
+    /// @sa https://json.nlohmann.me/api/basic_json/erase/
+    template < class IteratorType, typename std::enable_if <
+                   std::is_same<IteratorType, typename basic_json_t::iterator>::value ||
+                   std::is_same<IteratorType, typename basic_json_t::const_iterator>::value, int >::type
+               = 0 >
+    IteratorType erase(IteratorType pos)
+    {
+        // make sure iterator fits the current value
+        if (JSON_HEDLEY_UNLIKELY(this != pos.m_object))
+        {
+            JSON_THROW(invalid_iterator::create(202, "iterator does not fit current value", *this));
+        }
+
+        IteratorType result = end();
+
+        switch (m_type)
+        {
+            case value_t::boolean:
+            case value_t::number_float:
+            case value_t::number_integer:
+            case value_t::number_unsigned:
+            case value_t::string:
+            case value_t::binary:
+            {
+                if (JSON_HEDLEY_UNLIKELY(!pos.m_it.primitive_iterator.is_begin()))
+                {
+                    JSON_THROW(invalid_iterator::create(205, "iterator out of range", *this));
+                }
+
+                if (is_string())
+                {
+                    AllocatorType<string_t> alloc;
+                    std::allocator_traits<decltype(alloc)>::destroy(alloc, m_value.string);
+                    std::allocator_traits<decltype(alloc)>::deallocate(alloc, m_value.string, 1);
+                    m_value.string = nullptr;
+                }
+                else if (is_binary())
+                {
+                    AllocatorType<binary_t> alloc;
+                    std::allocator_traits<decltype(alloc)>::destroy(alloc, m_value.binary);
+                    std::allocator_traits<decltype(alloc)>::deallocate(alloc, m_value.binary, 1);
+                    m_value.binary = nullptr;
+                }
+
+                m_type = value_t::null;
+                assert_invariant();
+                break;
+            }
+
+            case value_t::object:
+            {
+                result.m_it.object_iterator = m_value.object->erase(pos.m_it.object_iterator);
+                break;
+            }
+
+            case value_t::array:
+            {
+                result.m_it.array_iterator = m_value.array->erase(pos.m_it.array_iterator);
+                break;
+            }
+
+            case value_t::null:
+            case value_t::discarded:
+            default:
+                JSON_THROW(type_error::create(307, "cannot use erase() with " + std::string(type_name()), *this));
+        }
+
+        return result;
+    }
+
+    /// @brief remove elements given an iterator range
+    /// @sa https://json.nlohmann.me/api/basic_json/erase/
+    template < class IteratorType, typename std::enable_if <
+                   std::is_same<IteratorType, typename basic_json_t::iterator>::value ||
+                   std::is_same<IteratorType, typename basic_json_t::const_iterator>::value, int >::type
+               = 0 >
+    IteratorType erase(IteratorType first, IteratorType last)
+    {
+        // make sure iterator fits the current value
+        if (JSON_HEDLEY_UNLIKELY(this != first.m_object || this != last.m_object))
+        {
+            JSON_THROW(invalid_iterator::create(203, "iterators do not fit current value", *this));
+        }
+
+        IteratorType result = end();
+
+        switch (m_type)
+        {
+            case value_t::boolean:
+            case value_t::number_float:
+            case value_t::number_integer:
+            case value_t::number_unsigned:
+            case value_t::string:
+            case value_t::binary:
+            {
+                if (JSON_HEDLEY_UNLIKELY(!first.m_it.primitive_iterator.is_begin()
+                                         || !last.m_it.primitive_iterator.is_end()))
+                {
+                    JSON_THROW(invalid_iterator::create(204, "iterators out of range", *this));
+                }
+
+                if (is_string())
+                {
+                    AllocatorType<string_t> alloc;
+                    std::allocator_traits<decltype(alloc)>::destroy(alloc, m_value.string);
+                    std::allocator_traits<decltype(alloc)>::deallocate(alloc, m_value.string, 1);
+                    m_value.string = nullptr;
+                }
+                else if (is_binary())
+                {
+                    AllocatorType<binary_t> alloc;
+                    std::allocator_traits<decltype(alloc)>::destroy(alloc, m_value.binary);
+                    std::allocator_traits<decltype(alloc)>::deallocate(alloc, m_value.binary, 1);
+                    m_value.binary = nullptr;
+                }
+
+                m_type = value_t::null;
+                assert_invariant();
+                break;
+            }
+
+            case value_t::object:
+            {
+                result.m_it.object_iterator = m_value.object->erase(first.m_it.object_iterator,
+                                              last.m_it.object_iterator);
+                break;
+            }
+
+            case value_t::array:
+            {
+                result.m_it.array_iterator = m_value.array->erase(first.m_it.array_iterator,
+                                             last.m_it.array_iterator);
+                break;
+            }
+
+            case value_t::null:
+            case value_t::discarded:
+            default:
+                JSON_THROW(type_error::create(307, "cannot use erase() with " + std::string(type_name()), *this));
+        }
+
+        return result;
+    }
+
+    /// @brief remove element from a JSON object given a key
+    /// @sa https://json.nlohmann.me/api/basic_json/erase/
+    size_type erase(const typename object_t::key_type& key)
+    {
+        // this erase only works for objects
+        if (JSON_HEDLEY_LIKELY(is_object()))
+        {
+            return m_value.object->erase(key);
+        }
+
+        JSON_THROW(type_error::create(307, "cannot use erase() with " + std::string(type_name()), *this));
+    }
+
+    /// @brief remove element from a JSON array given an index
+    /// @sa https://json.nlohmann.me/api/basic_json/erase/
+    void erase(const size_type idx)
+    {
+        // this erase only works for arrays
+        if (JSON_HEDLEY_LIKELY(is_array()))
+        {
+            if (JSON_HEDLEY_UNLIKELY(idx >= size()))
+            {
+                JSON_THROW(out_of_range::create(401, "array index " + std::to_string(idx) + " is out of range", *this));
+            }
+
+            m_value.array->erase(m_value.array->begin() + static_cast<difference_type>(idx));
+        }
+        else
+        {
+            JSON_THROW(type_error::create(307, "cannot use erase() with " + std::string(type_name()), *this));
+        }
+    }
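+
+    // Usage sketch for the erase() overloads (a non-normative example,
+    // assuming the default `json` alias):
+    //
+    //   json j = R"({"a": 1, "b": 2})"_json;
+    //   j.erase("a");            // by key; returns the number of removals
+    //   json arr = {1, 2, 3};
+    //   arr.erase(arr.begin());  // by iterator
+    //   arr.erase(0);            // by array index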
+
+    /// @}
+
+
+    ////////////
+    // lookup //
+    ////////////
+
+    /// @name lookup
+    /// @{
+
+    /// @brief find an element in a JSON object
+    /// @sa https://json.nlohmann.me/api/basic_json/find/
+    template<typename KeyT>
+    iterator find(KeyT&& key)
+    {
+        auto result = end();
+
+        if (is_object())
+        {
+            result.m_it.object_iterator = m_value.object->find(std::forward<KeyT>(key));
+        }
+
+        return result;
+    }
+
+    /// @brief find an element in a JSON object
+    /// @sa https://json.nlohmann.me/api/basic_json/find/
+    template<typename KeyT>
+    const_iterator find(KeyT&& key) const
+    {
+        auto result = cend();
+
+        if (is_object())
+        {
+            result.m_it.object_iterator = m_value.object->find(std::forward<KeyT>(key));
+        }
+
+        return result;
+    }
+
+    /// @brief returns the number of occurrences of a key in a JSON object
+    /// @sa https://json.nlohmann.me/api/basic_json/count/
+    template<typename KeyT>
+    size_type count(KeyT&& key) const
+    {
+        // return 0 for all nonobject types
+        return is_object() ? m_value.object->count(std::forward<KeyT>(key)) : 0;
+    }
+
+    /// @brief check the existence of an element in a JSON object
+    /// @sa https://json.nlohmann.me/api/basic_json/contains/
+    template < typename KeyT, typename std::enable_if <
+                   !std::is_same<typename std::decay<KeyT>::type, json_pointer>::value, int >::type = 0 >
+    bool contains(KeyT && key) const
+    {
+        return is_object() && m_value.object->find(std::forward<KeyT>(key)) != m_value.object->end();
+    }
+
+    /// @brief check the existence of an element in a JSON object given a JSON pointer
+    /// @sa https://json.nlohmann.me/api/basic_json/contains/
+    bool contains(const json_pointer& ptr) const
+    {
+        return ptr.contains(this);
+    }
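+
+    // Usage sketch for the lookup helpers (a non-normative example, assuming
+    // the default `json` alias):
+    //
+    //   json j = R"({"a": 1})"_json;
+    //   j.contains("a");  // true
+    //   j.count("b");     // 0
+    //   auto it = j.find("a");
+    //   if (it != j.end())
+    //   {
+    //       int v = it->get<int>();  // v == 1
+    //   }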
+
+    /// @}
+
+
+    ///////////////
+    // iterators //
+    ///////////////
+
+    /// @name iterators
+    /// @{
+
+    /// @brief returns an iterator to the first element
+    /// @sa https://json.nlohmann.me/api/basic_json/begin/
+    iterator begin() noexcept
+    {
+        iterator result(this);
+        result.set_begin();
+        return result;
+    }
+
+    /// @brief returns an iterator to the first element
+    /// @sa https://json.nlohmann.me/api/basic_json/begin/
+    const_iterator begin() const noexcept
+    {
+        return cbegin();
+    }
+
+    /// @brief returns a const iterator to the first element
+    /// @sa https://json.nlohmann.me/api/basic_json/cbegin/
+    const_iterator cbegin() const noexcept
+    {
+        const_iterator result(this);
+        result.set_begin();
+        return result;
+    }
+
+    /// @brief returns an iterator to one past the last element
+    /// @sa https://json.nlohmann.me/api/basic_json/end/
+    iterator end() noexcept
+    {
+        iterator result(this);
+        result.set_end();
+        return result;
+    }
+
+    /// @brief returns an iterator to one past the last element
+    /// @sa https://json.nlohmann.me/api/basic_json/end/
+    const_iterator end() const noexcept
+    {
+        return cend();
+    }
+
+    /// @brief returns an iterator to one past the last element
+    /// @sa https://json.nlohmann.me/api/basic_json/cend/
+    const_iterator cend() const noexcept
+    {
+        const_iterator result(this);
+        result.set_end();
+        return result;
+    }
+
+    /// @brief returns an iterator to the reverse-beginning
+    /// @sa https://json.nlohmann.me/api/basic_json/rbegin/
+    reverse_iterator rbegin() noexcept
+    {
+        return reverse_iterator(end());
+    }
+
+    /// @brief returns an iterator to the reverse-beginning
+    /// @sa https://json.nlohmann.me/api/basic_json/rbegin/
+    const_reverse_iterator rbegin() const noexcept
+    {
+        return crbegin();
+    }
+
+    /// @brief returns an iterator to the reverse-end
+    /// @sa https://json.nlohmann.me/api/basic_json/rend/
+    reverse_iterator rend() noexcept
+    {
+        return reverse_iterator(begin());
+    }
+
+    /// @brief returns an iterator to the reverse-end
+    /// @sa https://json.nlohmann.me/api/basic_json/rend/
+    const_reverse_iterator rend() const noexcept
+    {
+        return crend();
+    }
+
+    /// @brief returns a const reverse iterator to the last element
+    /// @sa https://json.nlohmann.me/api/basic_json/crbegin/
+    const_reverse_iterator crbegin() const noexcept
+    {
+        return const_reverse_iterator(cend());
+    }
+
+    /// @brief returns a const reverse iterator to one before the first
+    /// @sa https://json.nlohmann.me/api/basic_json/crend/
+    const_reverse_iterator crend() const noexcept
+    {
+        return const_reverse_iterator(cbegin());
+    }
+
+  public:
+    /// @brief wrapper to access iterator member functions in range-based for
+    /// @sa https://json.nlohmann.me/api/basic_json/items/
+    /// @deprecated This function is deprecated since 3.1.0 and will be removed in
+    ///             version 4.0.0 of the library. Please use @ref items() instead;
+    ///             that is, replace `json::iterator_wrapper(j)` with `j.items()`.
+    JSON_HEDLEY_DEPRECATED_FOR(3.1.0, items())
+    static iteration_proxy<iterator> iterator_wrapper(reference ref) noexcept
+    {
+        return ref.items();
+    }
+
+    /// @brief wrapper to access iterator member functions in range-based for
+    /// @sa https://json.nlohmann.me/api/basic_json/items/
+    /// @deprecated This function is deprecated since 3.1.0 and will be removed in
+    ///             version 4.0.0 of the library. Please use @ref items() instead;
+    ///             that is, replace `json::iterator_wrapper(j)` with `j.items()`.
+    JSON_HEDLEY_DEPRECATED_FOR(3.1.0, items())
+    static iteration_proxy<const_iterator> iterator_wrapper(const_reference ref) noexcept
+    {
+        return ref.items();
+    }
+
+    /// @brief helper to access iterator member functions in range-based for
+    /// @sa https://json.nlohmann.me/api/basic_json/items/
+    iteration_proxy<iterator> items() noexcept
+    {
+        return iteration_proxy<iterator>(*this);
+    }
+
+    /// @brief helper to access iterator member functions in range-based for
+    /// @sa https://json.nlohmann.me/api/basic_json/items/
+    iteration_proxy<const_iterator> items() const noexcept
+    {
+        return iteration_proxy<const_iterator>(*this);
+    }
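+
+    // Usage sketch for items() (a non-normative example, assuming the
+    // default `json` alias and <iostream>):
+    //
+    //   json j = R"({"a": 1, "b": 2})"_json;
+    //   for (auto& el : j.items())
+    //   {
+    //       std::cout << el.key() << " -> " << el.value() << '\n';
+    //   }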
+
+    /// @}
+
+
+    //////////////
+    // capacity //
+    //////////////
+
+    /// @name capacity
+    /// @{
+
+    /// @brief checks whether the container is empty.
+    /// @sa https://json.nlohmann.me/api/basic_json/empty/
+    bool empty() const noexcept
+    {
+        switch (m_type)
+        {
+            case value_t::null:
+            {
+                // null values are empty
+                return true;
+            }
+
+            case value_t::array:
+            {
+                // delegate call to array_t::empty()
+                return m_value.array->empty();
+            }
+
+            case value_t::object:
+            {
+                // delegate call to object_t::empty()
+                return m_value.object->empty();
+            }
+
+            case value_t::string:
+            case value_t::boolean:
+            case value_t::number_integer:
+            case value_t::number_unsigned:
+            case value_t::number_float:
+            case value_t::binary:
+            case value_t::discarded:
+            default:
+            {
+                // all other types are nonempty
+                return false;
+            }
+        }
+    }
+
+    /// @brief returns the number of elements
+    /// @sa https://json.nlohmann.me/api/basic_json/size/
+    size_type size() const noexcept
+    {
+        switch (m_type)
+        {
+            case value_t::null:
+            {
+                // null values are empty
+                return 0;
+            }
+
+            case value_t::array:
+            {
+                // delegate call to array_t::size()
+                return m_value.array->size();
+            }
+
+            case value_t::object:
+            {
+                // delegate call to object_t::size()
+                return m_value.object->size();
+            }
+
+            case value_t::string:
+            case value_t::boolean:
+            case value_t::number_integer:
+            case value_t::number_unsigned:
+            case value_t::number_float:
+            case value_t::binary:
+            case value_t::discarded:
+            default:
+            {
+                // all other types have size 1
+                return 1;
+            }
+        }
+    }
+
+    /// @brief returns the maximum possible number of elements
+    /// @sa https://json.nlohmann.me/api/basic_json/max_size/
+    size_type max_size() const noexcept
+    {
+        switch (m_type)
+        {
+            case value_t::array:
+            {
+                // delegate call to array_t::max_size()
+                return m_value.array->max_size();
+            }
+
+            case value_t::object:
+            {
+                // delegate call to object_t::max_size()
+                return m_value.object->max_size();
+            }
+
+            case value_t::null:
+            case value_t::string:
+            case value_t::boolean:
+            case value_t::number_integer:
+            case value_t::number_unsigned:
+            case value_t::number_float:
+            case value_t::binary:
+            case value_t::discarded:
+            default:
+            {
+                // all other types have max_size() == size()
+                return size();
+            }
+        }
+    }
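+
+    // Usage sketch for the capacity queries (a non-normative example,
+    // assuming the default `json` alias): primitive values report size 1,
+    // null reports size 0.
+    //
+    //   json j = 42;
+    //   j.empty();  // false
+    //   j.size();   // 1
+    //   json n;     // null
+    //   n.size();   // 0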
+
+    /// @}
+
+
+    ///////////////
+    // modifiers //
+    ///////////////
+
+    /// @name modifiers
+    /// @{
+
+    /// @brief clears the contents
+    /// @sa https://json.nlohmann.me/api/basic_json/clear/
+    void clear() noexcept
+    {
+        switch (m_type)
+        {
+            case value_t::number_integer:
+            {
+                m_value.number_integer = 0;
+                break;
+            }
+
+            case value_t::number_unsigned:
+            {
+                m_value.number_unsigned = 0;
+                break;
+            }
+
+            case value_t::number_float:
+            {
+                m_value.number_float = 0.0;
+                break;
+            }
+
+            case value_t::boolean:
+            {
+                m_value.boolean = false;
+                break;
+            }
+
+            case value_t::string:
+            {
+                m_value.string->clear();
+                break;
+            }
+
+            case value_t::binary:
+            {
+                m_value.binary->clear();
+                break;
+            }
+
+            case value_t::array:
+            {
+                m_value.array->clear();
+                break;
+            }
+
+            case value_t::object:
+            {
+                m_value.object->clear();
+                break;
+            }
+
+            case value_t::null:
+            case value_t::discarded:
+            default:
+                break;
+        }
+    }
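+
+    // Usage sketch for clear() (a non-normative example, assuming the
+    // default `json` alias): the stored type is kept, only the value is
+    // reset.
+    //
+    //   json j = {1, 2, 3};
+    //   j.clear();     // j is now an empty array, not null
+    //   j.is_array();  // true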
+
+    /// @brief add an object to an array
+    /// @sa https://json.nlohmann.me/api/basic_json/push_back/
+    void push_back(basic_json&& val)
+    {
+        // push_back only works for null objects or arrays
+        if (JSON_HEDLEY_UNLIKELY(!(is_null() || is_array())))
+        {
+            JSON_THROW(type_error::create(308, "cannot use push_back() with " + std::string(type_name()), *this));
+        }
+
+        // transform null object into an array
+        if (is_null())
+        {
+            m_type = value_t::array;
+            m_value = value_t::array;
+            assert_invariant();
+        }
+
+        // add element to array (move semantics)
+        const auto old_capacity = m_value.array->capacity();
+        m_value.array->push_back(std::move(val));
+        set_parent(m_value.array->back(), old_capacity);
+        // if val is moved from, the basic_json move constructor marks it null, so we do not need to call its destructor
+    }
+
+    /// @brief add an object to an array
+    /// @sa https://json.nlohmann.me/api/basic_json/operator+=/
+    reference operator+=(basic_json&& val)
+    {
+        push_back(std::move(val));
+        return *this;
+    }
+
+    /// @brief add an object to an array
+    /// @sa https://json.nlohmann.me/api/basic_json/push_back/
+    void push_back(const basic_json& val)
+    {
+        // push_back only works for null objects or arrays
+        if (JSON_HEDLEY_UNLIKELY(!(is_null() || is_array())))
+        {
+            JSON_THROW(type_error::create(308, "cannot use push_back() with " + std::string(type_name()), *this));
+        }
+
+        // transform null object into an array
+        if (is_null())
+        {
+            m_type = value_t::array;
+            m_value = value_t::array;
+            assert_invariant();
+        }
+
+        // add element to array
+        const auto old_capacity = m_value.array->capacity();
+        m_value.array->push_back(val);
+        set_parent(m_value.array->back(), old_capacity);
+    }
+
+    /// @brief add an object to an array
+    /// @sa https://json.nlohmann.me/api/basic_json/operator+=/
+    reference operator+=(const basic_json& val)
+    {
+        push_back(val);
+        return *this;
+    }
+
+    /// @brief add an object to an object
+    /// @sa https://json.nlohmann.me/api/basic_json/push_back/
+    void push_back(const typename object_t::value_type& val)
+    {
+        // push_back only works for null objects or objects
+        if (JSON_HEDLEY_UNLIKELY(!(is_null() || is_object())))
+        {
+            JSON_THROW(type_error::create(308, "cannot use push_back() with " + std::string(type_name()), *this));
+        }
+
+        // transform null object into an object
+        if (is_null())
+        {
+            m_type = value_t::object;
+            m_value = value_t::object;
+            assert_invariant();
+        }
+
+        // add element to object
+        auto res = m_value.object->insert(val);
+        set_parent(res.first->second);
+    }
+
+    /// @brief add an object to an object
+    /// @sa https://json.nlohmann.me/api/basic_json/operator+=/
+    reference operator+=(const typename object_t::value_type& val)
+    {
+        push_back(val);
+        return *this;
+    }
+
+    /// @brief add an object to an object
+    /// @sa https://json.nlohmann.me/api/basic_json/push_back/
+    void push_back(initializer_list_t init)
+    {
+        if (is_object() && init.size() == 2 && (*init.begin())->is_string())
+        {
+            basic_json&& key = init.begin()->moved_or_copied();
+            push_back(typename object_t::value_type(
+                          std::move(key.get_ref<string_t&>()), (init.begin() + 1)->moved_or_copied()));
+        }
+        else
+        {
+            push_back(basic_json(init));
+        }
+    }
+
+    /// @brief add an object to an object
+    /// @sa https://json.nlohmann.me/api/basic_json/operator+=/
+    reference operator+=(initializer_list_t init)
+    {
+        push_back(init);
+        return *this;
+    }
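+
+    // Usage sketch for push_back()/operator+= (a non-normative example,
+    // assuming the default `json` alias):
+    //
+    //   json arr;                         // null
+    //   arr.push_back(1);                 // arr becomes [1]
+    //   arr += 2;                         // [1, 2]
+    //   json obj = json::object();
+    //   obj.push_back({"key", "value"});  // {"key": "value"}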
+
+    /// @brief add an object to an array
+    /// @sa https://json.nlohmann.me/api/basic_json/emplace_back/
+    template<class... Args>
+    reference emplace_back(Args&& ... args)
+    {
+        // emplace_back only works for null objects or arrays
+        if (JSON_HEDLEY_UNLIKELY(!(is_null() || is_array())))
+        {
+            JSON_THROW(type_error::create(311, "cannot use emplace_back() with " + std::string(type_name()), *this));
+        }
+
+        // transform null object into an array
+        if (is_null())
+        {
+            m_type = value_t::array;
+            m_value = value_t::array;
+            assert_invariant();
+        }
+
+        // add element to array (perfect forwarding)
+        const auto old_capacity = m_value.array->capacity();
+        m_value.array->emplace_back(std::forward<Args>(args)...);
+        return set_parent(m_value.array->back(), old_capacity);
+    }
+
+    /// @brief add an object to an object if key does not exist
+    /// @sa https://json.nlohmann.me/api/basic_json/emplace/
+    template<class... Args>
+    std::pair<iterator, bool> emplace(Args&& ... args)
+    {
+        // emplace only works for null objects or objects
+        if (JSON_HEDLEY_UNLIKELY(!(is_null() || is_object())))
+        {
+            JSON_THROW(type_error::create(311, "cannot use emplace() with " + std::string(type_name()), *this));
+        }
+
+        // transform null object into an object
+        if (is_null())
+        {
+            m_type = value_t::object;
+            m_value = value_t::object;
+            assert_invariant();
+        }
+
+        // add element to object (perfect forwarding)
+        auto res = m_value.object->emplace(std::forward<Args>(args)...);
+        set_parent(res.first->second);
+
+        // create result iterator and set iterator to the result of emplace
+        auto it = begin();
+        it.m_it.object_iterator = res.first;
+
+        // return pair of iterator and boolean
+        return {it, res.second};
+    }
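+
+    // Usage sketch for emplace()/emplace_back() (a non-normative example,
+    // assuming the default `json` alias):
+    //
+    //   json arr = {1, 2};
+    //   arr.emplace_back(3);                     // [1, 2, 3]
+    //   json obj = json::object();
+    //   auto res = obj.emplace("key", "value");  // res.second == true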
+
+    /// Helper for the insert() overloads that add elements to an array
+    /// @note: This uses std::distance to support GCC 4.8,
+    ///        see https://github.com/nlohmann/json/pull/1257
+    template<typename... Args>
+    iterator insert_iterator(const_iterator pos, Args&& ... args)
+    {
+        iterator result(this);
+        JSON_ASSERT(m_value.array != nullptr);
+
+        auto insert_pos = std::distance(m_value.array->begin(), pos.m_it.array_iterator);
+        m_value.array->insert(pos.m_it.array_iterator, std::forward<Args>(args)...);
+        result.m_it.array_iterator = m_value.array->begin() + insert_pos;
+
+        // This could have been written as:
+        // result.m_it.array_iterator = m_value.array->insert(pos.m_it.array_iterator, cnt, val);
+        // but the return value of insert is missing in GCC 4.8, so it is written this way instead.
+
+        set_parents();
+        return result;
+    }
+
+    /// @brief inserts element into array
+    /// @sa https://json.nlohmann.me/api/basic_json/insert/
+    iterator insert(const_iterator pos, const basic_json& val)
+    {
+        // insert only works for arrays
+        if (JSON_HEDLEY_LIKELY(is_array()))
+        {
+            // check if iterator pos fits to this JSON value
+            if (JSON_HEDLEY_UNLIKELY(pos.m_object != this))
+            {
+                JSON_THROW(invalid_iterator::create(202, "iterator does not fit current value", *this));
+            }
+
+            // insert to array and return iterator
+            return insert_iterator(pos, val);
+        }
+
+        JSON_THROW(type_error::create(309, "cannot use insert() with " + std::string(type_name()), *this));
+    }
+
+    /// @brief inserts element into array
+    /// @sa https://json.nlohmann.me/api/basic_json/insert/
+    iterator insert(const_iterator pos, basic_json&& val)
+    {
+        return insert(pos, val);
+    }
+
+    /// @brief inserts copies of element into array
+    /// @sa https://json.nlohmann.me/api/basic_json/insert/
+    iterator insert(const_iterator pos, size_type cnt, const basic_json& val)
+    {
+        // insert only works for arrays
+        if (JSON_HEDLEY_LIKELY(is_array()))
+        {
+            // check if iterator pos fits to this JSON value
+            if (JSON_HEDLEY_UNLIKELY(pos.m_object != this))
+            {
+                JSON_THROW(invalid_iterator::create(202, "iterator does not fit current value", *this));
+            }
+
+            // insert to array and return iterator
+            return insert_iterator(pos, cnt, val);
+        }
+
+        JSON_THROW(type_error::create(309, "cannot use insert() with " + std::string(type_name()), *this));
+    }
+
+    /// @brief inserts range of elements into array
+    /// @sa https://json.nlohmann.me/api/basic_json/insert/
+    iterator insert(const_iterator pos, const_iterator first, const_iterator last)
+    {
+        // insert only works for arrays
+        if (JSON_HEDLEY_UNLIKELY(!is_array()))
+        {
+            JSON_THROW(type_error::create(309, "cannot use insert() with " + std::string(type_name()), *this));
+        }
+
+        // check if iterator pos fits to this JSON value
+        if (JSON_HEDLEY_UNLIKELY(pos.m_object != this))
+        {
+            JSON_THROW(invalid_iterator::create(202, "iterator does not fit current value", *this));
+        }
+
+        // check if range iterators belong to the same JSON object
+        if (JSON_HEDLEY_UNLIKELY(first.m_object != last.m_object))
+        {
+            JSON_THROW(invalid_iterator::create(210, "iterators do not fit", *this));
+        }
+
+        if (JSON_HEDLEY_UNLIKELY(first.m_object == this))
+        {
+            JSON_THROW(invalid_iterator::create(211, "passed iterators may not belong to container", *this));
+        }
+
+        // insert to array and return iterator
+        return insert_iterator(pos, first.m_it.array_iterator, last.m_it.array_iterator);
+    }
+
+    /// @brief inserts elements from initializer list into array
+    /// @sa https://json.nlohmann.me/api/basic_json/insert/
+    iterator insert(const_iterator pos, initializer_list_t ilist)
+    {
+        // insert only works for arrays
+        if (JSON_HEDLEY_UNLIKELY(!is_array()))
+        {
+            JSON_THROW(type_error::create(309, "cannot use insert() with " + std::string(type_name()), *this));
+        }
+
+        // check if iterator pos fits to this JSON value
+        if (JSON_HEDLEY_UNLIKELY(pos.m_object != this))
+        {
+            JSON_THROW(invalid_iterator::create(202, "iterator does not fit current value", *this));
+        }
+
+        // insert to array and return iterator
+        return insert_iterator(pos, ilist.begin(), ilist.end());
+    }
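+
+    // Usage sketch for the array insert() overloads above (illustrative, not
+    // part of the library):
+    //
+    //   json a = {1, 4};
+    //   a.insert(a.begin() + 1, 2);             // single value: [1,2,4]
+    //   a.insert(a.begin() + 2, 2, 3);          // two copies of 3: [1,2,3,3,4]
+    //   json b = {8, 9};
+    //   a.insert(a.end(), b.begin(), b.end());  // range: [1,2,3,3,4,8,9]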
+
+    /// @brief inserts range of elements into object
+    /// @sa https://json.nlohmann.me/api/basic_json/insert/
+    void insert(const_iterator first, const_iterator last)
+    {
+        // insert only works for objects
+        if (JSON_HEDLEY_UNLIKELY(!is_object()))
+        {
+            JSON_THROW(type_error::create(309, "cannot use insert() with " + std::string(type_name()), *this));
+        }
+
+        // check if range iterators belong to the same JSON object
+        if (JSON_HEDLEY_UNLIKELY(first.m_object != last.m_object))
+        {
+            JSON_THROW(invalid_iterator::create(210, "iterators do not fit", *this));
+        }
+
+        // passed iterators must belong to objects
+        if (JSON_HEDLEY_UNLIKELY(!first.m_object->is_object()))
+        {
+            JSON_THROW(invalid_iterator::create(202, "iterators first and last must point to objects", *this));
+        }
+
+        m_value.object->insert(first.m_it.object_iterator, last.m_it.object_iterator);
+    }
+
+    /// @brief updates a JSON object from another object, overwriting existing keys
+    /// @sa https://json.nlohmann.me/api/basic_json/update/
+    void update(const_reference j, bool merge_objects = false)
+    {
+        update(j.begin(), j.end(), merge_objects);
+    }
+
+    /// @brief updates a JSON object from another object, overwriting existing keys
+    /// @sa https://json.nlohmann.me/api/basic_json/update/
+    void update(const_iterator first, const_iterator last, bool merge_objects = false)
+    {
+        // implicitly convert null value to an empty object
+        if (is_null())
+        {
+            m_type = value_t::object;
+            m_value.object = create<object_t>();
+            assert_invariant();
+        }
+
+        if (JSON_HEDLEY_UNLIKELY(!is_object()))
+        {
+            JSON_THROW(type_error::create(312, "cannot use update() with " + std::string(type_name()), *this));
+        }
+
+        // check if range iterators belong to the same JSON object
+        if (JSON_HEDLEY_UNLIKELY(first.m_object != last.m_object))
+        {
+            JSON_THROW(invalid_iterator::create(210, "iterators do not fit", *this));
+        }
+
+        // passed iterators must belong to objects
+        if (JSON_HEDLEY_UNLIKELY(!first.m_object->is_object()))
+        {
+            JSON_THROW(type_error::create(312, "cannot use update() with " + std::string(first.m_object->type_name()), *first.m_object));
+        }
+
+        for (auto it = first; it != last; ++it)
+        {
+            if (merge_objects && it.value().is_object())
+            {
+                auto it2 = m_value.object->find(it.key());
+                if (it2 != m_value.object->end())
+                {
+                    it2->second.update(it.value(), true);
+                    continue;
+                }
+            }
+            m_value.object->operator[](it.key()) = it.value();
+#if JSON_DIAGNOSTICS
+            m_value.object->operator[](it.key()).m_parent = this;
+#endif
+        }
+    }
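+
+    // Usage sketch (illustrative, not part of the library):
+    //
+    //   json o = {{"a", 1}, {"c", {{"x", 1}}}};
+    //   json p = {{"b", 2}, {"c", {{"y", 2}}}};
+    //   o.update(p);        // {"a":1,"b":2,"c":{"y":2}} -- "c" is replaced
+    //
+    //   json q = {{"a", 1}, {"c", {{"x", 1}}}};
+    //   q.update(p, true);  // merge_objects: "c" becomes {"x":1,"y":2}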
+
+    /// @brief exchanges the values
+    /// @sa https://json.nlohmann.me/api/basic_json/swap/
+    void swap(reference other) noexcept (
+        std::is_nothrow_move_constructible<value_t>::value&&
+        std::is_nothrow_move_assignable<value_t>::value&&
+        std::is_nothrow_move_constructible<json_value>::value&&
+        std::is_nothrow_move_assignable<json_value>::value
+    )
+    {
+        std::swap(m_type, other.m_type);
+        std::swap(m_value, other.m_value);
+
+        set_parents();
+        other.set_parents();
+        assert_invariant();
+    }
+
+    /// @brief exchanges the values
+    /// @sa https://json.nlohmann.me/api/basic_json/swap/
+    friend void swap(reference left, reference right) noexcept (
+        std::is_nothrow_move_constructible<value_t>::value&&
+        std::is_nothrow_move_assignable<value_t>::value&&
+        std::is_nothrow_move_constructible<json_value>::value&&
+        std::is_nothrow_move_assignable<json_value>::value
+    )
+    {
+        left.swap(right);
+    }
+
+    /// @brief exchanges the values
+    /// @sa https://json.nlohmann.me/api/basic_json/swap/
+    void swap(array_t& other) // NOLINT(bugprone-exception-escape)
+    {
+        // swap only works for arrays
+        if (JSON_HEDLEY_LIKELY(is_array()))
+        {
+            std::swap(*(m_value.array), other);
+        }
+        else
+        {
+            JSON_THROW(type_error::create(310, "cannot use swap() with " + std::string(type_name()), *this));
+        }
+    }
+
+    /// @brief exchanges the values
+    /// @sa https://json.nlohmann.me/api/basic_json/swap/
+    void swap(object_t& other) // NOLINT(bugprone-exception-escape)
+    {
+        // swap only works for objects
+        if (JSON_HEDLEY_LIKELY(is_object()))
+        {
+            std::swap(*(m_value.object), other);
+        }
+        else
+        {
+            JSON_THROW(type_error::create(310, "cannot use swap() with " + std::string(type_name()), *this));
+        }
+    }
+
+    /// @brief exchanges the values
+    /// @sa https://json.nlohmann.me/api/basic_json/swap/
+    void swap(string_t& other) // NOLINT(bugprone-exception-escape)
+    {
+        // swap only works for strings
+        if (JSON_HEDLEY_LIKELY(is_string()))
+        {
+            std::swap(*(m_value.string), other);
+        }
+        else
+        {
+            JSON_THROW(type_error::create(310, "cannot use swap() with " + std::string(type_name()), *this));
+        }
+    }
+
+    /// @brief exchanges the values
+    /// @sa https://json.nlohmann.me/api/basic_json/swap/
+    void swap(binary_t& other) // NOLINT(bugprone-exception-escape)
+    {
+        // swap only works for binary values
+        if (JSON_HEDLEY_LIKELY(is_binary()))
+        {
+            std::swap(*(m_value.binary), other);
+        }
+        else
+        {
+            JSON_THROW(type_error::create(310, "cannot use swap() with " + std::string(type_name()), *this));
+        }
+    }
+
+    /// @brief exchanges the values
+    /// @sa https://json.nlohmann.me/api/basic_json/swap/
+    void swap(typename binary_t::container_type& other) // NOLINT(bugprone-exception-escape)
+    {
+        // swap only works for binary values
+        if (JSON_HEDLEY_LIKELY(is_binary()))
+        {
+            std::swap(*(m_value.binary), other);
+        }
+        else
+        {
+            JSON_THROW(type_error::create(310, "cannot use swap() with " + std::string(type_name()), *this));
+        }
+    }
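+
+    // Usage sketch for the swap() overloads (illustrative, not part of the
+    // library):
+    //
+    //   json a = {1, 2, 3};
+    //   json b = "hello";
+    //   a.swap(b);               // a == "hello", b == [1,2,3]
+    //
+    //   std::string s = "external";
+    //   a.swap(s);               // ok: a == "external", s == "hello"
+    //   b.swap(s);               // throws type_error.310: b is an array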
+
+    /// @}
+
+  public:
+    //////////////////////////////////////////
+    // lexicographical comparison operators //
+    //////////////////////////////////////////
+
+    /// @name lexicographical comparison operators
+    /// @{
+
+    /// @brief comparison: equal
+    /// @sa https://json.nlohmann.me/api/basic_json/operator_eq/
+    friend bool operator==(const_reference lhs, const_reference rhs) noexcept
+    {
+#ifdef __GNUC__
+#pragma GCC diagnostic push
+#pragma GCC diagnostic ignored "-Wfloat-equal"
+#endif
+        const auto lhs_type = lhs.type();
+        const auto rhs_type = rhs.type();
+
+        if (lhs_type == rhs_type)
+        {
+            switch (lhs_type)
+            {
+                case value_t::array:
+                    return *lhs.m_value.array == *rhs.m_value.array;
+
+                case value_t::object:
+                    return *lhs.m_value.object == *rhs.m_value.object;
+
+                case value_t::null:
+                    return true;
+
+                case value_t::string:
+                    return *lhs.m_value.string == *rhs.m_value.string;
+
+                case value_t::boolean:
+                    return lhs.m_value.boolean == rhs.m_value.boolean;
+
+                case value_t::number_integer:
+                    return lhs.m_value.number_integer == rhs.m_value.number_integer;
+
+                case value_t::number_unsigned:
+                    return lhs.m_value.number_unsigned == rhs.m_value.number_unsigned;
+
+                case value_t::number_float:
+                    return lhs.m_value.number_float == rhs.m_value.number_float;
+
+                case value_t::binary:
+                    return *lhs.m_value.binary == *rhs.m_value.binary;
+
+                case value_t::discarded:
+                default:
+                    return false;
+            }
+        }
+        else if (lhs_type == value_t::number_integer && rhs_type == value_t::number_float)
+        {
+            return static_cast<number_float_t>(lhs.m_value.number_integer) == rhs.m_value.number_float;
+        }
+        else if (lhs_type == value_t::number_float && rhs_type == value_t::number_integer)
+        {
+            return lhs.m_value.number_float == static_cast<number_float_t>(rhs.m_value.number_integer);
+        }
+        else if (lhs_type == value_t::number_unsigned && rhs_type == value_t::number_float)
+        {
+            return static_cast<number_float_t>(lhs.m_value.number_unsigned) == rhs.m_value.number_float;
+        }
+        else if (lhs_type == value_t::number_float && rhs_type == value_t::number_unsigned)
+        {
+            return lhs.m_value.number_float == static_cast<number_float_t>(rhs.m_value.number_unsigned);
+        }
+        else if (lhs_type == value_t::number_unsigned && rhs_type == value_t::number_integer)
+        {
+            return static_cast<number_integer_t>(lhs.m_value.number_unsigned) == rhs.m_value.number_integer;
+        }
+        else if (lhs_type == value_t::number_integer && rhs_type == value_t::number_unsigned)
+        {
+            return lhs.m_value.number_integer == static_cast<number_integer_t>(rhs.m_value.number_unsigned);
+        }
+
+        return false;
+#ifdef __GNUC__
+#pragma GCC diagnostic pop
+#endif
+    }
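+
+    // Behavior sketch (illustrative, not part of the library):
+    //
+    //   json(1) == json(1.0)      // true: integer and float compare as float
+    //   json(1u) == json(1)       // true: unsigned and signed compare as integer
+    //   json("1") == json(1)      // false: different non-numeric types
+    //   json(nullptr) == json()   // true: both values are null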
+
+    /// @brief comparison: equal
+    /// @sa https://json.nlohmann.me/api/basic_json/operator_eq/
+    template<typename ScalarType, typename std::enable_if<
+                 std::is_scalar<ScalarType>::value, int>::type = 0>
+    friend bool operator==(const_reference lhs, ScalarType rhs) noexcept
+    {
+        return lhs == basic_json(rhs);
+    }
+
+    /// @brief comparison: equal
+    /// @sa https://json.nlohmann.me/api/basic_json/operator_eq/
+    template<typename ScalarType, typename std::enable_if<
+                 std::is_scalar<ScalarType>::value, int>::type = 0>
+    friend bool operator==(ScalarType lhs, const_reference rhs) noexcept
+    {
+        return basic_json(lhs) == rhs;
+    }
+
+    /// @brief comparison: not equal
+    /// @sa https://json.nlohmann.me/api/basic_json/operator_ne/
+    friend bool operator!=(const_reference lhs, const_reference rhs) noexcept
+    {
+        return !(lhs == rhs);
+    }
+
+    /// @brief comparison: not equal
+    /// @sa https://json.nlohmann.me/api/basic_json/operator_ne/
+    template<typename ScalarType, typename std::enable_if<
+                 std::is_scalar<ScalarType>::value, int>::type = 0>
+    friend bool operator!=(const_reference lhs, ScalarType rhs) noexcept
+    {
+        return lhs != basic_json(rhs);
+    }
+
+    /// @brief comparison: not equal
+    /// @sa https://json.nlohmann.me/api/basic_json/operator_ne/
+    template<typename ScalarType, typename std::enable_if<
+                 std::is_scalar<ScalarType>::value, int>::type = 0>
+    friend bool operator!=(ScalarType lhs, const_reference rhs) noexcept
+    {
+        return basic_json(lhs) != rhs;
+    }
+
+    /// @brief comparison: less than
+    /// @sa https://json.nlohmann.me/api/basic_json/operator_lt/
+    friend bool operator<(const_reference lhs, const_reference rhs) noexcept
+    {
+        const auto lhs_type = lhs.type();
+        const auto rhs_type = rhs.type();
+
+        if (lhs_type == rhs_type)
+        {
+            switch (lhs_type)
+            {
+                case value_t::array:
+                    // note parentheses are necessary, see
+                    // https://github.com/nlohmann/json/issues/1530
+                    return (*lhs.m_value.array) < (*rhs.m_value.array);
+
+                case value_t::object:
+                    return (*lhs.m_value.object) < (*rhs.m_value.object);
+
+                case value_t::null:
+                    return false;
+
+                case value_t::string:
+                    return (*lhs.m_value.string) < (*rhs.m_value.string);
+
+                case value_t::boolean:
+                    return (lhs.m_value.boolean) < (rhs.m_value.boolean);
+
+                case value_t::number_integer:
+                    return (lhs.m_value.number_integer) < (rhs.m_value.number_integer);
+
+                case value_t::number_unsigned:
+                    return (lhs.m_value.number_unsigned) < (rhs.m_value.number_unsigned);
+
+                case value_t::number_float:
+                    return (lhs.m_value.number_float) < (rhs.m_value.number_float);
+
+                case value_t::binary:
+                    return (*lhs.m_value.binary) < (*rhs.m_value.binary);
+
+                case value_t::discarded:
+                default:
+                    return false;
+            }
+        }
+        else if (lhs_type == value_t::number_integer && rhs_type == value_t::number_float)
+        {
+            return static_cast<number_float_t>(lhs.m_value.number_integer) < rhs.m_value.number_float;
+        }
+        else if (lhs_type == value_t::number_float && rhs_type == value_t::number_integer)
+        {
+            return lhs.m_value.number_float < static_cast<number_float_t>(rhs.m_value.number_integer);
+        }
+        else if (lhs_type == value_t::number_unsigned && rhs_type == value_t::number_float)
+        {
+            return static_cast<number_float_t>(lhs.m_value.number_unsigned) < rhs.m_value.number_float;
+        }
+        else if (lhs_type == value_t::number_float && rhs_type == value_t::number_unsigned)
+        {
+            return lhs.m_value.number_float < static_cast<number_float_t>(rhs.m_value.number_unsigned);
+        }
+        else if (lhs_type == value_t::number_integer && rhs_type == value_t::number_unsigned)
+        {
+            return lhs.m_value.number_integer < static_cast<number_integer_t>(rhs.m_value.number_unsigned);
+        }
+        else if (lhs_type == value_t::number_unsigned && rhs_type == value_t::number_integer)
+        {
+            return static_cast<number_integer_t>(lhs.m_value.number_unsigned) < rhs.m_value.number_integer;
+        }
+
+        // We only reach this line if we cannot compare values. In that case,
+        // we compare types. Note we have to call the operator explicitly,
+        // because MSVC has problems otherwise.
+        return operator<(lhs_type, rhs_type);
+    }
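+
+    // Behavior sketch (illustrative, not part of the library):
+    //
+    //   json(1) < json(2.5)       // true: mixed numeric types compare numerically
+    //   json(nullptr) < json(1)   // true: otherwise values are ordered by a
+    //                             // fixed type rank (null lowest), see
+    //                             // operator<(value_t, value_t)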
+
+    /// @brief comparison: less than
+    /// @sa https://json.nlohmann.me/api/basic_json/operator_lt/
+    template<typename ScalarType, typename std::enable_if<
+                 std::is_scalar<ScalarType>::value, int>::type = 0>
+    friend bool operator<(const_reference lhs, ScalarType rhs) noexcept
+    {
+        return lhs < basic_json(rhs);
+    }
+
+    /// @brief comparison: less than
+    /// @sa https://json.nlohmann.me/api/basic_json/operator_lt/
+    template<typename ScalarType, typename std::enable_if<
+                 std::is_scalar<ScalarType>::value, int>::type = 0>
+    friend bool operator<(ScalarType lhs, const_reference rhs) noexcept
+    {
+        return basic_json(lhs) < rhs;
+    }
+
+    /// @brief comparison: less than or equal
+    /// @sa https://json.nlohmann.me/api/basic_json/operator_le/
+    friend bool operator<=(const_reference lhs, const_reference rhs) noexcept
+    {
+        return !(rhs < lhs);
+    }
+
+    /// @brief comparison: less than or equal
+    /// @sa https://json.nlohmann.me/api/basic_json/operator_le/
+    template<typename ScalarType, typename std::enable_if<
+                 std::is_scalar<ScalarType>::value, int>::type = 0>
+    friend bool operator<=(const_reference lhs, ScalarType rhs) noexcept
+    {
+        return lhs <= basic_json(rhs);
+    }
+
+    /// @brief comparison: less than or equal
+    /// @sa https://json.nlohmann.me/api/basic_json/operator_le/
+    template<typename ScalarType, typename std::enable_if<
+                 std::is_scalar<ScalarType>::value, int>::type = 0>
+    friend bool operator<=(ScalarType lhs, const_reference rhs) noexcept
+    {
+        return basic_json(lhs) <= rhs;
+    }
+
+    /// @brief comparison: greater than
+    /// @sa https://json.nlohmann.me/api/basic_json/operator_gt/
+    friend bool operator>(const_reference lhs, const_reference rhs) noexcept
+    {
+        return !(lhs <= rhs);
+    }
+
+    /// @brief comparison: greater than
+    /// @sa https://json.nlohmann.me/api/basic_json/operator_gt/
+    template<typename ScalarType, typename std::enable_if<
+                 std::is_scalar<ScalarType>::value, int>::type = 0>
+    friend bool operator>(const_reference lhs, ScalarType rhs) noexcept
+    {
+        return lhs > basic_json(rhs);
+    }
+
+    /// @brief comparison: greater than
+    /// @sa https://json.nlohmann.me/api/basic_json/operator_gt/
+    template<typename ScalarType, typename std::enable_if<
+                 std::is_scalar<ScalarType>::value, int>::type = 0>
+    friend bool operator>(ScalarType lhs, const_reference rhs) noexcept
+    {
+        return basic_json(lhs) > rhs;
+    }
+
+    /// @brief comparison: greater than or equal
+    /// @sa https://json.nlohmann.me/api/basic_json/operator_ge/
+    friend bool operator>=(const_reference lhs, const_reference rhs) noexcept
+    {
+        return !(lhs < rhs);
+    }
+
+    /// @brief comparison: greater than or equal
+    /// @sa https://json.nlohmann.me/api/basic_json/operator_ge/
+    template<typename ScalarType, typename std::enable_if<
+                 std::is_scalar<ScalarType>::value, int>::type = 0>
+    friend bool operator>=(const_reference lhs, ScalarType rhs) noexcept
+    {
+        return lhs >= basic_json(rhs);
+    }
+
+    /// @brief comparison: greater than or equal
+    /// @sa https://json.nlohmann.me/api/basic_json/operator_ge/
+    template<typename ScalarType, typename std::enable_if<
+                 std::is_scalar<ScalarType>::value, int>::type = 0>
+    friend bool operator>=(ScalarType lhs, const_reference rhs) noexcept
+    {
+        return basic_json(lhs) >= rhs;
+    }
+
+    /// @}
+
+    ///////////////////
+    // serialization //
+    ///////////////////
+
+    /// @name serialization
+    /// @{
+#ifndef JSON_NO_IO
+    /// @brief serialize to stream
+    /// @sa https://json.nlohmann.me/api/basic_json/operator_ltlt/
+    friend std::ostream& operator<<(std::ostream& o, const basic_json& j)
+    {
+        // read width member and use it as indentation parameter if nonzero
+        const bool pretty_print = o.width() > 0;
+        const auto indentation = pretty_print ? o.width() : 0;
+
+        // reset width to 0 for subsequent calls to this stream
+        o.width(0);
+
+        // do the actual serialization
+        serializer s(detail::output_adapter<char>(o), o.fill());
+        s.dump(j, pretty_print, false, static_cast<unsigned int>(indentation));
+        return o;
+    }
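+
+    // Usage sketch (illustrative, not part of the library; needs <iostream>
+    // and <iomanip>):
+    //
+    //   json j = {{"pi", 3.141}};
+    //   std::cout << j << '\n';            // compact: {"pi":3.141}
+    //   std::cout << std::setw(4) << j;    // pretty-printed, 4-space indent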
+
+    /// @brief serialize to stream
+    /// @sa https://json.nlohmann.me/api/basic_json/operator_ltlt/
+    /// @deprecated This function is deprecated since 3.0.0 and will be removed in
+    ///             version 4.0.0 of the library. Please use
+    ///             operator<<(std::ostream&, const basic_json&) instead; that is,
+    ///             replace calls like `j >> o;` with `o << j;`.
+    JSON_HEDLEY_DEPRECATED_FOR(3.0.0, operator<<(std::ostream&, const basic_json&))
+    friend std::ostream& operator>>(const basic_json& j, std::ostream& o)
+    {
+        return o << j;
+    }
+#endif  // JSON_NO_IO
+    /// @}
+
+
+    /////////////////////
+    // deserialization //
+    /////////////////////
+
+    /// @name deserialization
+    /// @{
+
+    /// @brief deserialize from a compatible input
+    /// @sa https://json.nlohmann.me/api/basic_json/parse/
+    template<typename InputType>
+    JSON_HEDLEY_WARN_UNUSED_RESULT
+    static basic_json parse(InputType&& i,
+                            const parser_callback_t cb = nullptr,
+                            const bool allow_exceptions = true,
+                            const bool ignore_comments = false)
+    {
+        basic_json result;
+        parser(detail::input_adapter(std::forward<InputType>(i)), cb, allow_exceptions, ignore_comments).parse(true, result);
+        return result;
+    }
+
+    /// @brief deserialize from a pair of character iterators
+    /// @sa https://json.nlohmann.me/api/basic_json/parse/
+    template<typename IteratorType>
+    JSON_HEDLEY_WARN_UNUSED_RESULT
+    static basic_json parse(IteratorType first,
+                            IteratorType last,
+                            const parser_callback_t cb = nullptr,
+                            const bool allow_exceptions = true,
+                            const bool ignore_comments = false)
+    {
+        basic_json result;
+        parser(detail::input_adapter(std::move(first), std::move(last)), cb, allow_exceptions, ignore_comments).parse(true, result);
+        return result;
+    }
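+
+    // Usage sketch (illustrative, not part of the library):
+    //
+    //   json j = json::parse(R"({"happy": true})");
+    //   // tolerate comments in the input:
+    //   json k = json::parse("{} // comment", nullptr, true, true);
+    //   // suppress exceptions; a parse error yields a discarded value:
+    //   json l = json::parse("[1,2", nullptr, false);
+    //   // l.is_discarded() == true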
+
+    JSON_HEDLEY_WARN_UNUSED_RESULT
+    JSON_HEDLEY_DEPRECATED_FOR(3.8.0, parse(ptr, ptr + len))
+    static basic_json parse(detail::span_input_adapter&& i,
+                            const parser_callback_t cb = nullptr,
+                            const bool allow_exceptions = true,
+                            const bool ignore_comments = false)
+    {
+        basic_json result;
+        parser(i.get(), cb, allow_exceptions, ignore_comments).parse(true, result);
+        return result;
+    }
+
+    /// @brief check if the input is valid JSON
+    /// @sa https://json.nlohmann.me/api/basic_json/accept/
+    template<typename InputType>
+    static bool accept(InputType&& i,
+                       const bool ignore_comments = false)
+    {
+        return parser(detail::input_adapter(std::forward<InputType>(i)), nullptr, false, ignore_comments).accept(true);
+    }
+
+    /// @brief check if the input is valid JSON
+    /// @sa https://json.nlohmann.me/api/basic_json/accept/
+    template<typename IteratorType>
+    static bool accept(IteratorType first, IteratorType last,
+                       const bool ignore_comments = false)
+    {
+        return parser(detail::input_adapter(std::move(first), std::move(last)), nullptr, false, ignore_comments).accept(true);
+    }
+
+    JSON_HEDLEY_WARN_UNUSED_RESULT
+    JSON_HEDLEY_DEPRECATED_FOR(3.8.0, accept(ptr, ptr + len))
+    static bool accept(detail::span_input_adapter&& i,
+                       const bool ignore_comments = false)
+    {
+        return parser(i.get(), nullptr, false, ignore_comments).accept(true);
+    }
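+
+    // Usage sketch (illustrative, not part of the library):
+    //
+    //   json::accept("[1,2,3]");   // true
+    //   json::accept("[1,2,3");    // false -- accept() never throws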
+
+    /// @brief generate SAX events
+    /// @sa https://json.nlohmann.me/api/basic_json/sax_parse/
+    template <typename InputType, typename SAX>
+    JSON_HEDLEY_NON_NULL(2)
+    static bool sax_parse(InputType&& i, SAX* sax,
+                          input_format_t format = input_format_t::json,
+                          const bool strict = true,
+                          const bool ignore_comments = false)
+    {
+        auto ia = detail::input_adapter(std::forward<InputType>(i));
+        return format == input_format_t::json
+               ? parser(std::move(ia), nullptr, true, ignore_comments).sax_parse(sax, strict)
+               : detail::binary_reader<basic_json, decltype(ia), SAX>(std::move(ia)).sax_parse(format, sax, strict);
+    }
+
+    /// @brief generate SAX events
+    /// @sa https://json.nlohmann.me/api/basic_json/sax_parse/
+    template<class IteratorType, class SAX>
+    JSON_HEDLEY_NON_NULL(3)
+    static bool sax_parse(IteratorType first, IteratorType last, SAX* sax,
+                          input_format_t format = input_format_t::json,
+                          const bool strict = true,
+                          const bool ignore_comments = false)
+    {
+        auto ia = detail::input_adapter(std::move(first), std::move(last));
+        return format == input_format_t::json
+               ? parser(std::move(ia), nullptr, true, ignore_comments).sax_parse(sax, strict)
+               : detail::binary_reader<basic_json, decltype(ia), SAX>(std::move(ia)).sax_parse(format, sax, strict);
+    }
+
+    /// @brief generate SAX events
+    /// @sa https://json.nlohmann.me/api/basic_json/sax_parse/
+    /// @deprecated This function is deprecated since 3.8.0 and will be removed in
+    ///             version 4.0.0 of the library. Please use
+    ///             sax_parse(ptr, ptr + len) instead.
+    template <typename SAX>
+    JSON_HEDLEY_DEPRECATED_FOR(3.8.0, sax_parse(ptr, ptr + len, ...))
+    JSON_HEDLEY_NON_NULL(2)
+    static bool sax_parse(detail::span_input_adapter&& i, SAX* sax,
+                          input_format_t format = input_format_t::json,
+                          const bool strict = true,
+                          const bool ignore_comments = false)
+    {
+        auto ia = i.get();
+        return format == input_format_t::json
+               // NOLINTNEXTLINE(hicpp-move-const-arg,performance-move-const-arg)
+               ? parser(std::move(ia), nullptr, true, ignore_comments).sax_parse(sax, strict)
+               // NOLINTNEXTLINE(hicpp-move-const-arg,performance-move-const-arg)
+               : detail::binary_reader<basic_json, decltype(ia), SAX>(std::move(ia)).sax_parse(format, sax, strict);
+    }
+#ifndef JSON_NO_IO
+    /// @brief deserialize from stream
+    /// @sa https://json.nlohmann.me/api/basic_json/operator_gtgt/
+    /// @deprecated This stream operator is deprecated since 3.0.0 and will be removed in
+    ///             version 4.0.0 of the library. Please use
+    ///             operator>>(std::istream&, basic_json&) instead; that is,
+    ///             replace calls like `j << i;` with `i >> j;`.
+    JSON_HEDLEY_DEPRECATED_FOR(3.0.0, operator>>(std::istream&, basic_json&))
+    friend std::istream& operator<<(basic_json& j, std::istream& i)
+    {
+        return operator>>(i, j);
+    }
+
+    /// @brief deserialize from stream
+    /// @sa https://json.nlohmann.me/api/basic_json/operator_gtgt/
+    friend std::istream& operator>>(std::istream& i, basic_json& j)
+    {
+        parser(detail::input_adapter(i)).parse(false, j);
+        return i;
+    }
+#endif  // JSON_NO_IO
+    /// @}
+
+    ///////////////////////////
+    // convenience functions //
+    ///////////////////////////
+
+    /// @brief return the type as string
+    /// @sa https://json.nlohmann.me/api/basic_json/type_name/
+    JSON_HEDLEY_RETURNS_NON_NULL
+    const char* type_name() const noexcept
+    {
+        switch (m_type)
+        {
+            case value_t::null:
+                return "null";
+            case value_t::object:
+                return "object";
+            case value_t::array:
+                return "array";
+            case value_t::string:
+                return "string";
+            case value_t::boolean:
+                return "boolean";
+            case value_t::binary:
+                return "binary";
+            case value_t::discarded:
+                return "discarded";
+            case value_t::number_integer:
+            case value_t::number_unsigned:
+            case value_t::number_float:
+            default:
+                return "number";
+        }
+    }
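+
+    // Behavior sketch (illustrative, not part of the library):
+    //
+    //   json(17).type_name()        // "number"
+    //   json::array().type_name()   // "array"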
+
+
+  JSON_PRIVATE_UNLESS_TESTED:
+    //////////////////////
+    // member variables //
+    //////////////////////
+
+    /// the type of the current element
+    value_t m_type = value_t::null;
+
+    /// the value of the current element
+    json_value m_value = {};
+
+#if JSON_DIAGNOSTICS
+    /// a pointer to a parent value (for debugging purposes)
+    basic_json* m_parent = nullptr;
+#endif
+
+    //////////////////////////////////////////
+    // binary serialization/deserialization //
+    //////////////////////////////////////////
+
+    /// @name binary serialization/deserialization support
+    /// @{
+
+  public:
+    /// @brief create a CBOR serialization of a given JSON value
+    /// @sa https://json.nlohmann.me/api/basic_json/to_cbor/
+    static std::vector<std::uint8_t> to_cbor(const basic_json& j)
+    {
+        std::vector<std::uint8_t> result;
+        to_cbor(j, result);
+        return result;
+    }
+
+    /// @brief create a CBOR serialization of a given JSON value
+    /// @sa https://json.nlohmann.me/api/basic_json/to_cbor/
+    static void to_cbor(const basic_json& j, detail::output_adapter<std::uint8_t> o)
+    {
+        binary_writer<std::uint8_t>(o).write_cbor(j);
+    }
+
+    /// @brief create a CBOR serialization of a given JSON value
+    /// @sa https://json.nlohmann.me/api/basic_json/to_cbor/
+    static void to_cbor(const basic_json& j, detail::output_adapter<char> o)
+    {
+        binary_writer<char>(o).write_cbor(j);
+    }
+
+    /// @brief create a MessagePack serialization of a given JSON value
+    /// @sa https://json.nlohmann.me/api/basic_json/to_msgpack/
+    static std::vector<std::uint8_t> to_msgpack(const basic_json& j)
+    {
+        std::vector<std::uint8_t> result;
+        to_msgpack(j, result);
+        return result;
+    }
+
+    /// @brief create a MessagePack serialization of a given JSON value
+    /// @sa https://json.nlohmann.me/api/basic_json/to_msgpack/
+    static void to_msgpack(const basic_json& j, detail::output_adapter<std::uint8_t> o)
+    {
+        binary_writer<std::uint8_t>(o).write_msgpack(j);
+    }
+
+    /// @brief create a MessagePack serialization of a given JSON value
+    /// @sa https://json.nlohmann.me/api/basic_json/to_msgpack/
+    static void to_msgpack(const basic_json& j, detail::output_adapter<char> o)
+    {
+        binary_writer<char>(o).write_msgpack(j);
+    }
+
+    /// @brief create a UBJSON serialization of a given JSON value
+    /// @sa https://json.nlohmann.me/api/basic_json/to_ubjson/
+    static std::vector<std::uint8_t> to_ubjson(const basic_json& j,
+            const bool use_size = false,
+            const bool use_type = false)
+    {
+        std::vector<std::uint8_t> result;
+        to_ubjson(j, result, use_size, use_type);
+        return result;
+    }
+
+    /// @brief create a UBJSON serialization of a given JSON value
+    /// @sa https://json.nlohmann.me/api/basic_json/to_ubjson/
+    static void to_ubjson(const basic_json& j, detail::output_adapter<std::uint8_t> o,
+                          const bool use_size = false, const bool use_type = false)
+    {
+        binary_writer<std::uint8_t>(o).write_ubjson(j, use_size, use_type);
+    }
+
+    /// @brief create a UBJSON serialization of a given JSON value
+    /// @sa https://json.nlohmann.me/api/basic_json/to_ubjson/
+    static void to_ubjson(const basic_json& j, detail::output_adapter<char> o,
+                          const bool use_size = false, const bool use_type = false)
+    {
+        binary_writer<char>(o).write_ubjson(j, use_size, use_type);
+    }
+
+    /// @brief create a BSON serialization of a given JSON value
+    /// @sa https://json.nlohmann.me/api/basic_json/to_bson/
+    static std::vector<std::uint8_t> to_bson(const basic_json& j)
+    {
+        std::vector<std::uint8_t> result;
+        to_bson(j, result);
+        return result;
+    }
+
+    /// @brief create a BSON serialization of a given JSON value
+    /// @sa https://json.nlohmann.me/api/basic_json/to_bson/
+    static void to_bson(const basic_json& j, detail::output_adapter<std::uint8_t> o)
+    {
+        binary_writer<std::uint8_t>(o).write_bson(j);
+    }
+
+    /// @brief create a BSON serialization of a given JSON value
+    /// @sa https://json.nlohmann.me/api/basic_json/to_bson/
+    static void to_bson(const basic_json& j, detail::output_adapter<char> o)
+    {
+        binary_writer<char>(o).write_bson(j);
+    }
+
+    /// @brief create a JSON value from an input in CBOR format
+    /// @sa https://json.nlohmann.me/api/basic_json/from_cbor/
+    template<typename InputType>
+    JSON_HEDLEY_WARN_UNUSED_RESULT
+    static basic_json from_cbor(InputType&& i,
+                                const bool strict = true,
+                                const bool allow_exceptions = true,
+                                const cbor_tag_handler_t tag_handler = cbor_tag_handler_t::error)
+    {
+        basic_json result;
+        detail::json_sax_dom_parser<basic_json> sdp(result, allow_exceptions);
+        auto ia = detail::input_adapter(std::forward<InputType>(i));
+        const bool res = binary_reader<decltype(ia)>(std::move(ia)).sax_parse(input_format_t::cbor, &sdp, strict, tag_handler);
+        return res ? result : basic_json(value_t::discarded);
+    }
+
+    /// @brief create a JSON value from an input in CBOR format
+    /// @sa https://json.nlohmann.me/api/basic_json/from_cbor/
+    template<typename IteratorType>
+    JSON_HEDLEY_WARN_UNUSED_RESULT
+    static basic_json from_cbor(IteratorType first, IteratorType last,
+                                const bool strict = true,
+                                const bool allow_exceptions = true,
+                                const cbor_tag_handler_t tag_handler = cbor_tag_handler_t::error)
+    {
+        basic_json result;
+        detail::json_sax_dom_parser<basic_json> sdp(result, allow_exceptions);
+        auto ia = detail::input_adapter(std::move(first), std::move(last));
+        const bool res = binary_reader<decltype(ia)>(std::move(ia)).sax_parse(input_format_t::cbor, &sdp, strict, tag_handler);
+        return res ? result : basic_json(value_t::discarded);
+    }
+
+    template<typename T>
+    JSON_HEDLEY_WARN_UNUSED_RESULT
+    JSON_HEDLEY_DEPRECATED_FOR(3.8.0, from_cbor(ptr, ptr + len))
+    static basic_json from_cbor(const T* ptr, std::size_t len,
+                                const bool strict = true,
+                                const bool allow_exceptions = true,
+                                const cbor_tag_handler_t tag_handler = cbor_tag_handler_t::error)
+    {
+        return from_cbor(ptr, ptr + len, strict, allow_exceptions, tag_handler);
+    }
+
+
+    JSON_HEDLEY_WARN_UNUSED_RESULT
+    JSON_HEDLEY_DEPRECATED_FOR(3.8.0, from_cbor(ptr, ptr + len))
+    static basic_json from_cbor(detail::span_input_adapter&& i,
+                                const bool strict = true,
+                                const bool allow_exceptions = true,
+                                const cbor_tag_handler_t tag_handler = cbor_tag_handler_t::error)
+    {
+        basic_json result;
+        detail::json_sax_dom_parser<basic_json> sdp(result, allow_exceptions);
+        auto ia = i.get();
+        // NOLINTNEXTLINE(hicpp-move-const-arg,performance-move-const-arg)
+        const bool res = binary_reader<decltype(ia)>(std::move(ia)).sax_parse(input_format_t::cbor, &sdp, strict, tag_handler);
+        return res ? result : basic_json(value_t::discarded);
+    }
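+
+    // Round-trip sketch (illustrative, not part of the library); the
+    // MessagePack, UBJSON, and BSON functions below follow the same pattern,
+    // except that BSON requires an object at the top level:
+    //
+    //   json j = {{"compact", true}, {"schema", 0}};
+    //   std::vector<std::uint8_t> v = json::to_cbor(j);
+    //   json back = json::from_cbor(v);
+    //   // back == j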
+
+    /// @brief create a JSON value from an input in MessagePack format
+    /// @sa https://json.nlohmann.me/api/basic_json/from_msgpack/
+    template<typename InputType>
+    JSON_HEDLEY_WARN_UNUSED_RESULT
+    static basic_json from_msgpack(InputType&& i,
+                                   const bool strict = true,
+                                   const bool allow_exceptions = true)
+    {
+        basic_json result;
+        detail::json_sax_dom_parser<basic_json> sdp(result, allow_exceptions);
+        auto ia = detail::input_adapter(std::forward<InputType>(i));
+        const bool res = binary_reader<decltype(ia)>(std::move(ia)).sax_parse(input_format_t::msgpack, &sdp, strict);
+        return res ? result : basic_json(value_t::discarded);
+    }
+
+    /// @brief create a JSON value from an input in MessagePack format
+    /// @sa https://json.nlohmann.me/api/basic_json/from_msgpack/
+    template<typename IteratorType>
+    JSON_HEDLEY_WARN_UNUSED_RESULT
+    static basic_json from_msgpack(IteratorType first, IteratorType last,
+                                   const bool strict = true,
+                                   const bool allow_exceptions = true)
+    {
+        basic_json result;
+        detail::json_sax_dom_parser<basic_json> sdp(result, allow_exceptions);
+        auto ia = detail::input_adapter(std::move(first), std::move(last));
+        const bool res = binary_reader<decltype(ia)>(std::move(ia)).sax_parse(input_format_t::msgpack, &sdp, strict);
+        return res ? result : basic_json(value_t::discarded);
+    }
+
+    template<typename T>
+    JSON_HEDLEY_WARN_UNUSED_RESULT
+    JSON_HEDLEY_DEPRECATED_FOR(3.8.0, from_msgpack(ptr, ptr + len))
+    static basic_json from_msgpack(const T* ptr, std::size_t len,
+                                   const bool strict = true,
+                                   const bool allow_exceptions = true)
+    {
+        return from_msgpack(ptr, ptr + len, strict, allow_exceptions);
+    }
+
+    JSON_HEDLEY_WARN_UNUSED_RESULT
+    JSON_HEDLEY_DEPRECATED_FOR(3.8.0, from_msgpack(ptr, ptr + len))
+    static basic_json from_msgpack(detail::span_input_adapter&& i,
+                                   const bool strict = true,
+                                   const bool allow_exceptions = true)
+    {
+        basic_json result;
+        detail::json_sax_dom_parser<basic_json> sdp(result, allow_exceptions);
+        auto ia = i.get();
+        // NOLINTNEXTLINE(hicpp-move-const-arg,performance-move-const-arg)
+        const bool res = binary_reader<decltype(ia)>(std::move(ia)).sax_parse(input_format_t::msgpack, &sdp, strict);
+        return res ? result : basic_json(value_t::discarded);
+    }
+
+    /// @brief create a JSON value from an input in UBJSON format
+    /// @sa https://json.nlohmann.me/api/basic_json/from_ubjson/
+    template<typename InputType>
+    JSON_HEDLEY_WARN_UNUSED_RESULT
+    static basic_json from_ubjson(InputType&& i,
+                                  const bool strict = true,
+                                  const bool allow_exceptions = true)
+    {
+        basic_json result;
+        detail::json_sax_dom_parser<basic_json> sdp(result, allow_exceptions);
+        auto ia = detail::input_adapter(std::forward<InputType>(i));
+        const bool res = binary_reader<decltype(ia)>(std::move(ia)).sax_parse(input_format_t::ubjson, &sdp, strict);
+        return res ? result : basic_json(value_t::discarded);
+    }
+
+    /// @brief create a JSON value from an input in UBJSON format
+    /// @sa https://json.nlohmann.me/api/basic_json/from_ubjson/
+    template<typename IteratorType>
+    JSON_HEDLEY_WARN_UNUSED_RESULT
+    static basic_json from_ubjson(IteratorType first, IteratorType last,
+                                  const bool strict = true,
+                                  const bool allow_exceptions = true)
+    {
+        basic_json result;
+        detail::json_sax_dom_parser<basic_json> sdp(result, allow_exceptions);
+        auto ia = detail::input_adapter(std::move(first), std::move(last));
+        const bool res = binary_reader<decltype(ia)>(std::move(ia)).sax_parse(input_format_t::ubjson, &sdp, strict);
+        return res ? result : basic_json(value_t::discarded);
+    }
+
+    template<typename T>
+    JSON_HEDLEY_WARN_UNUSED_RESULT
+    JSON_HEDLEY_DEPRECATED_FOR(3.8.0, from_ubjson(ptr, ptr + len))
+    static basic_json from_ubjson(const T* ptr, std::size_t len,
+                                  const bool strict = true,
+                                  const bool allow_exceptions = true)
+    {
+        return from_ubjson(ptr, ptr + len, strict, allow_exceptions);
+    }
+
+    JSON_HEDLEY_WARN_UNUSED_RESULT
+    JSON_HEDLEY_DEPRECATED_FOR(3.8.0, from_ubjson(ptr, ptr + len))
+    static basic_json from_ubjson(detail::span_input_adapter&& i,
+                                  const bool strict = true,
+                                  const bool allow_exceptions = true)
+    {
+        basic_json result;
+        detail::json_sax_dom_parser<basic_json> sdp(result, allow_exceptions);
+        auto ia = i.get();
+        // NOLINTNEXTLINE(hicpp-move-const-arg,performance-move-const-arg)
+        const bool res = binary_reader<decltype(ia)>(std::move(ia)).sax_parse(input_format_t::ubjson, &sdp, strict);
+        return res ? result : basic_json(value_t::discarded);
+    }
+
+    /// @brief create a JSON value from an input in BSON format
+    /// @sa https://json.nlohmann.me/api/basic_json/from_bson/
+    template<typename InputType>
+    JSON_HEDLEY_WARN_UNUSED_RESULT
+    static basic_json from_bson(InputType&& i,
+                                const bool strict = true,
+                                const bool allow_exceptions = true)
+    {
+        basic_json result;
+        detail::json_sax_dom_parser<basic_json> sdp(result, allow_exceptions);
+        auto ia = detail::input_adapter(std::forward<InputType>(i));
+        const bool res = binary_reader<decltype(ia)>(std::move(ia)).sax_parse(input_format_t::bson, &sdp, strict);
+        return res ? result : basic_json(value_t::discarded);
+    }
+
+    /// @brief create a JSON value from an input in BSON format
+    /// @sa https://json.nlohmann.me/api/basic_json/from_bson/
+    template<typename IteratorType>
+    JSON_HEDLEY_WARN_UNUSED_RESULT
+    static basic_json from_bson(IteratorType first, IteratorType last,
+                                const bool strict = true,
+                                const bool allow_exceptions = true)
+    {
+        basic_json result;
+        detail::json_sax_dom_parser<basic_json> sdp(result, allow_exceptions);
+        auto ia = detail::input_adapter(std::move(first), std::move(last));
+        const bool res = binary_reader<decltype(ia)>(std::move(ia)).sax_parse(input_format_t::bson, &sdp, strict);
+        return res ? result : basic_json(value_t::discarded);
+    }
+
+    template<typename T>
+    JSON_HEDLEY_WARN_UNUSED_RESULT
+    JSON_HEDLEY_DEPRECATED_FOR(3.8.0, from_bson(ptr, ptr + len))
+    static basic_json from_bson(const T* ptr, std::size_t len,
+                                const bool strict = true,
+                                const bool allow_exceptions = true)
+    {
+        return from_bson(ptr, ptr + len, strict, allow_exceptions);
+    }
+
+    JSON_HEDLEY_WARN_UNUSED_RESULT
+    JSON_HEDLEY_DEPRECATED_FOR(3.8.0, from_bson(ptr, ptr + len))
+    static basic_json from_bson(detail::span_input_adapter&& i,
+                                const bool strict = true,
+                                const bool allow_exceptions = true)
+    {
+        basic_json result;
+        detail::json_sax_dom_parser<basic_json> sdp(result, allow_exceptions);
+        auto ia = i.get();
+        // NOLINTNEXTLINE(hicpp-move-const-arg,performance-move-const-arg)
+        const bool res = binary_reader<decltype(ia)>(std::move(ia)).sax_parse(input_format_t::bson, &sdp, strict);
+        return res ? result : basic_json(value_t::discarded);
+    }
+    /// @}
+
+    //////////////////////////
+    // JSON Pointer support //
+    //////////////////////////
+
+    /// @name JSON Pointer functions
+    /// @{
+
+    /// @brief access specified element via JSON Pointer
+    /// @sa https://json.nlohmann.me/api/basic_json/operator%5B%5D/
+    reference operator[](const json_pointer& ptr)
+    {
+        return ptr.get_unchecked(this);
+    }
+
+    /// @brief access specified element via JSON Pointer
+    /// @sa https://json.nlohmann.me/api/basic_json/operator%5B%5D/
+    const_reference operator[](const json_pointer& ptr) const
+    {
+        return ptr.get_unchecked(this);
+    }
+
+    /// @brief access specified element via JSON Pointer
+    /// @sa https://json.nlohmann.me/api/basic_json/at/
+    reference at(const json_pointer& ptr)
+    {
+        return ptr.get_checked(this);
+    }
+
+    /// @brief access specified element via JSON Pointer
+    /// @sa https://json.nlohmann.me/api/basic_json/at/
+    const_reference at(const json_pointer& ptr) const
+    {
+        return ptr.get_checked(this);
+    }
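+
+    // Usage sketch (illustrative, not part of the library):
+    //
+    //   json j = {{"numbers", {1, 2, 3}}};
+    //   json::json_pointer p("/numbers/1");
+    //   j[p] = 20;                             // unchecked: {"numbers":[1,20,3]}
+    //   j.at(json::json_pointer("/missing"));  // checked: throws out_of_range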
+
+    /// @brief return flattened JSON value
+    /// @sa https://json.nlohmann.me/api/basic_json/flatten/
+    basic_json flatten() const
+    {
+        basic_json result(value_t::object);
+        json_pointer::flatten("", *this, result);
+        return result;
+    }
+
+    /// @brief unflatten a previously flattened JSON value
+    /// @sa https://json.nlohmann.me/api/basic_json/unflatten/
+    basic_json unflatten() const
+    {
+        return json_pointer::unflatten(*this);
+    }
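+
+    // Usage sketch (illustrative, not part of the library):
+    //
+    //   json j = {{"a", {{"b", 1}}}, {"c", {1, 2}}};
+    //   json f = j.flatten();   // {"/a/b":1,"/c/0":1,"/c/1":2}
+    //   // f.unflatten() == j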
+
+    /// @}
+
+    //////////////////////////
+    // JSON Patch functions //
+    //////////////////////////
+
+    /// @name JSON Patch functions
+    /// @{
+
+    /// @brief applies a JSON patch
+    /// @sa https://json.nlohmann.me/api/basic_json/patch/
+    basic_json patch(const basic_json& json_patch) const
+    {
+        // make a working copy to apply the patch to
+        basic_json result = *this;
+
+        // the valid JSON Patch operations
+        enum class patch_operations {add, remove, replace, move, copy, test, invalid};
+
+        const auto get_op = [](const std::string & op)
+        {
+            if (op == "add")
+            {
+                return patch_operations::add;
+            }
+            if (op == "remove")
+            {
+                return patch_operations::remove;
+            }
+            if (op == "replace")
+            {
+                return patch_operations::replace;
+            }
+            if (op == "move")
+            {
+                return patch_operations::move;
+            }
+            if (op == "copy")
+            {
+                return patch_operations::copy;
+            }
+            if (op == "test")
+            {
+                return patch_operations::test;
+            }
+
+            return patch_operations::invalid;
+        };
+
+        // wrapper for "add" operation; add value at ptr
+        const auto operation_add = [&result](json_pointer & ptr, basic_json val)
+        {
+            // adding to the root of the target document means replacing it
+            if (ptr.empty())
+            {
+                result = val;
+                return;
+            }
+
+            // make sure the top element of the pointer exists
+            json_pointer top_pointer = ptr.top();
+            if (top_pointer != ptr)
+            {
+                result.at(top_pointer);
+            }
+
+            // get reference to parent of JSON pointer ptr
+            const auto last_path = ptr.back();
+            ptr.pop_back();
+            basic_json& parent = result[ptr];
+
+            switch (parent.m_type)
+            {
+                case value_t::null:
+                case value_t::object:
+                {
+                    // use operator[] to add value
+                    parent[last_path] = val;
+                    break;
+                }
+
+                case value_t::array:
+                {
+                    if (last_path == "-")
+                    {
+                        // special case: append to back
+                        parent.push_back(val);
+                    }
+                    else
+                    {
+                        const auto idx = json_pointer::array_index(last_path);
+                        if (JSON_HEDLEY_UNLIKELY(idx > parent.size()))
+                        {
+                            // avoid undefined behavior
+                            JSON_THROW(out_of_range::create(401, "array index " + std::to_string(idx) + " is out of range", parent));
+                        }
+
+                        // default case: insert value at the given offset
+                        parent.insert(parent.begin() + static_cast<difference_type>(idx), val);
+                    }
+                    break;
+                }
+
+                // if there exists a parent it cannot be primitive
+                case value_t::string: // LCOV_EXCL_LINE
+                case value_t::boolean: // LCOV_EXCL_LINE
+                case value_t::number_integer: // LCOV_EXCL_LINE
+                case value_t::number_unsigned: // LCOV_EXCL_LINE
+                case value_t::number_float: // LCOV_EXCL_LINE
+                case value_t::binary: // LCOV_EXCL_LINE
+                case value_t::discarded: // LCOV_EXCL_LINE
+                default:            // LCOV_EXCL_LINE
+                    JSON_ASSERT(false); // NOLINT(cert-dcl03-c,hicpp-static-assert,misc-static-assert) LCOV_EXCL_LINE
+            }
+        };
+
+        // wrapper for "remove" operation; remove value at ptr
+        const auto operation_remove = [this, &result](json_pointer & ptr)
+        {
+            // get reference to parent of JSON pointer ptr
+            const auto last_path = ptr.back();
+            ptr.pop_back();
+            basic_json& parent = result.at(ptr);
+
+            // remove child
+            if (parent.is_object())
+            {
+                // perform range check
+                auto it = parent.find(last_path);
+                if (JSON_HEDLEY_LIKELY(it != parent.end()))
+                {
+                    parent.erase(it);
+                }
+                else
+                {
+                    JSON_THROW(out_of_range::create(403, "key '" + last_path + "' not found", *this));
+                }
+            }
+            else if (parent.is_array())
+            {
+                // note erase performs range check
+                parent.erase(json_pointer::array_index(last_path));
+            }
+        };
+
+        // type check: top level value must be an array
+        if (JSON_HEDLEY_UNLIKELY(!json_patch.is_array()))
+        {
+            JSON_THROW(parse_error::create(104, 0, "JSON patch must be an array of objects", json_patch));
+        }
+
+        // iterate and apply the operations
+        for (const auto& val : json_patch)
+        {
+            // wrapper to get a value for an operation
+            const auto get_value = [&val](const std::string & op,
+                                          const std::string & member,
+                                          bool string_type) -> basic_json &
+            {
+                // find value
+                auto it = val.m_value.object->find(member);
+
+                // context-sensitive error message
+                const auto error_msg = (op == "op") ? "operation" : "operation '" + op + "'";
+
+                // check if desired value is present
+                if (JSON_HEDLEY_UNLIKELY(it == val.m_value.object->end()))
+                {
+                    // NOLINTNEXTLINE(performance-inefficient-string-concatenation)
+                    JSON_THROW(parse_error::create(105, 0, error_msg + " must have member '" + member + "'", val));
+                }
+
+                // check if result is of type string
+                if (JSON_HEDLEY_UNLIKELY(string_type && !it->second.is_string()))
+                {
+                    // NOLINTNEXTLINE(performance-inefficient-string-concatenation)
+                    JSON_THROW(parse_error::create(105, 0, error_msg + " must have string member '" + member + "'", val));
+                }
+
+                // no error: return value
+                return it->second;
+            };
+
+            // type check: every element of the array must be an object
+            if (JSON_HEDLEY_UNLIKELY(!val.is_object()))
+            {
+                JSON_THROW(parse_error::create(104, 0, "JSON patch must be an array of objects", val));
+            }
+
+            // collect mandatory members
+            const auto op = get_value("op", "op", true).template get<std::string>();
+            const auto path = get_value(op, "path", true).template get<std::string>();
+            json_pointer ptr(path);
+
+            switch (get_op(op))
+            {
+                case patch_operations::add:
+                {
+                    operation_add(ptr, get_value("add", "value", false));
+                    break;
+                }
+
+                case patch_operations::remove:
+                {
+                    operation_remove(ptr);
+                    break;
+                }
+
+                case patch_operations::replace:
+                {
+                    // the "path" location must exist - use at()
+                    result.at(ptr) = get_value("replace", "value", false);
+                    break;
+                }
+
+                case patch_operations::move:
+                {
+                    const auto from_path = get_value("move", "from", true).template get<std::string>();
+                    json_pointer from_ptr(from_path);
+
+                    // the "from" location must exist - use at()
+                    basic_json v = result.at(from_ptr);
+
+                    // The move operation is functionally identical to a
+                    // "remove" operation on the "from" location, followed
+                    // immediately by an "add" operation at the target
+                    // location with the value that was just removed.
+                    operation_remove(from_ptr);
+                    operation_add(ptr, v);
+                    break;
+                }
+
+                case patch_operations::copy:
+                {
+                    const auto from_path = get_value("copy", "from", true).template get<std::string>();
+                    const json_pointer from_ptr(from_path);
+
+                    // the "from" location must exist - use at()
+                    basic_json v = result.at(from_ptr);
+
+                    // The copy is functionally identical to an "add"
+                    // operation at the target location using the value
+                    // specified in the "from" member.
+                    operation_add(ptr, v);
+                    break;
+                }
+
+                case patch_operations::test:
+                {
+                    bool success = false;
+                    JSON_TRY
+                    {
+                        // check if "value" matches the one at "path"
+                        // the "path" location must exist - use at()
+                        success = (result.at(ptr) == get_value("test", "value", false));
+                    }
+                    JSON_INTERNAL_CATCH (out_of_range&)
+                    {
+                        // ignore out of range errors: success remains false
+                    }
+
+                    // throw an exception if test fails
+                    if (JSON_HEDLEY_UNLIKELY(!success))
+                    {
+                        JSON_THROW(other_error::create(501, "unsuccessful: " + val.dump(), val));
+                    }
+
+                    break;
+                }
+
+                case patch_operations::invalid:
+                default:
+                {
+                    // op must be "add", "remove", "replace", "move", "copy", or
+                    // "test"
+                    JSON_THROW(parse_error::create(105, 0, "operation value '" + op + "' is invalid", val));
+                }
+            }
+        }
+
+        return result;
+    }
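+
+    // Illustrative usage sketch, not part of the upstream header: patch()
+    // applies an RFC 6902 JSON Patch to a copy of *this and returns the
+    // result (assuming `using nlohmann::json`; `doc` and `p` are example
+    // names):
+    //
+    //   const auto doc = R"({"a":"b"})"_json;
+    //   const auto p   = R"([{"op":"replace","path":"/a","value":"c"}])"_json;
+    //   assert(doc.patch(p) == R"({"a":"c"})"_json);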
+
+    /// @brief creates a diff as a JSON patch
+    /// @sa https://json.nlohmann.me/api/basic_json/diff/
+    JSON_HEDLEY_WARN_UNUSED_RESULT
+    static basic_json diff(const basic_json& source, const basic_json& target,
+                           const std::string& path = "")
+    {
+        // the patch
+        basic_json result(value_t::array);
+
+        // if the values are the same, return empty patch
+        if (source == target)
+        {
+            return result;
+        }
+
+        if (source.type() != target.type())
+        {
+            // different types: replace value
+            result.push_back(
+            {
+                {"op", "replace"}, {"path", path}, {"value", target}
+            });
+            return result;
+        }
+
+        switch (source.type())
+        {
+            case value_t::array:
+            {
+                // first pass: traverse common elements
+                std::size_t i = 0;
+                while (i < source.size() && i < target.size())
+                {
+                    // recursive call to compare array values at index i
+                    auto temp_diff = diff(source[i], target[i], path + "/" + std::to_string(i));
+                    result.insert(result.end(), temp_diff.begin(), temp_diff.end());
+                    ++i;
+                }
+
+                // We have now reached the end of at least one array.
+                // In a second pass, traverse the remaining elements.
+
+                // remove the remaining elements from the source array
+                const auto end_index = static_cast<difference_type>(result.size());
+                while (i < source.size())
+                {
+                    // add operations in reverse order to avoid invalid
+                    // indices
+                    result.insert(result.begin() + end_index, object(
+                    {
+                        {"op", "remove"},
+                        {"path", path + "/" + std::to_string(i)}
+                    }));
+                    ++i;
+                }
+
+                // add the remaining elements of the target array
+                while (i < target.size())
+                {
+                    result.push_back(
+                    {
+                        {"op", "add"},
+                        {"path", path + "/-"},
+                        {"value", target[i]}
+                    });
+                    ++i;
+                }
+
+                break;
+            }
+
+            case value_t::object:
+            {
+                // first pass: traverse the source object's elements
+                for (auto it = source.cbegin(); it != source.cend(); ++it)
+                {
+                    // escape the key name to be used in a JSON patch
+                    const auto path_key = path + "/" + detail::escape(it.key());
+
+                    if (target.find(it.key()) != target.end())
+                    {
+                        // recursive call to compare object values at the current key
+                        auto temp_diff = diff(it.value(), target[it.key()], path_key);
+                        result.insert(result.end(), temp_diff.begin(), temp_diff.end());
+                    }
+                    else
+                    {
+                        // found a key that is not in the target object -> remove it
+                        result.push_back(object(
+                        {
+                            {"op", "remove"}, {"path", path_key}
+                        }));
+                    }
+                }
+
+                // second pass: traverse the target object's elements
+                for (auto it = target.cbegin(); it != target.cend(); ++it)
+                {
+                    if (source.find(it.key()) == source.end())
+                    {
+                        // found a key that is not in the source object -> add it
+                        const auto path_key = path + "/" + detail::escape(it.key());
+                        result.push_back(
+                        {
+                            {"op", "add"}, {"path", path_key},
+                            {"value", it.value()}
+                        });
+                    }
+                }
+
+                break;
+            }
+
+            case value_t::null:
+            case value_t::string:
+            case value_t::boolean:
+            case value_t::number_integer:
+            case value_t::number_unsigned:
+            case value_t::number_float:
+            case value_t::binary:
+            case value_t::discarded:
+            default:
+            {
+                // both values are of primitive type: replace value
+                result.push_back(
+                {
+                    {"op", "replace"}, {"path", path}, {"value", target}
+                });
+                break;
+            }
+        }
+
+        return result;
+    }
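+
+    // Illustrative sketch, not part of the upstream header: diff() produces
+    // an RFC 6902 patch that transforms `source` into `target`, so the
+    // round trip source.patch(diff(source, target)) == target holds
+    // (assuming `using nlohmann::json`; the values below are example data):
+    //
+    //   const auto a = R"({"name":"foo","age":1})"_json;
+    //   const auto b = R"({"name":"foo","age":2})"_json;
+    //   const auto p = json::diff(a, b);
+    //   // p == [{"op":"replace","path":"/age","value":2}]
+    //   assert(a.patch(p) == b);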
+
+    /// @}
+
+    ////////////////////////////////
+    // JSON Merge Patch functions //
+    ////////////////////////////////
+
+    /// @name JSON Merge Patch functions
+    /// @{
+
+    /// @brief applies a JSON Merge Patch
+    /// @sa https://json.nlohmann.me/api/basic_json/merge_patch/
+    void merge_patch(const basic_json& apply_patch)
+    {
+        if (apply_patch.is_object())
+        {
+            if (!is_object())
+            {
+                *this = object();
+            }
+            for (auto it = apply_patch.begin(); it != apply_patch.end(); ++it)
+            {
+                if (it.value().is_null())
+                {
+                    erase(it.key());
+                }
+                else
+                {
+                    operator[](it.key()).merge_patch(it.value());
+                }
+            }
+        }
+        else
+        {
+            *this = apply_patch;
+        }
+    }
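+
+    // Illustrative sketch, not part of the upstream header: RFC 7386 merge
+    // patch semantics -- object members merge recursively and null values
+    // delete members (example data taken from the RFC):
+    //
+    //   auto doc = R"({"a":"b","c":{"d":"e","f":"g"}})"_json;
+    //   doc.merge_patch(R"({"a":"z","c":{"f":null}})"_json);
+    //   // doc == {"a":"z","c":{"d":"e"}}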
+
+    /// @}
+};
+
+/// @brief user-defined to_string function for JSON values
+/// @sa https://json.nlohmann.me/api/basic_json/to_string/
+NLOHMANN_BASIC_JSON_TPL_DECLARATION
+std::string to_string(const NLOHMANN_BASIC_JSON_TPL& j)
+{
+    return j.dump();
+}
+
+} // namespace nlohmann
+
+///////////////////////
+// nonmember support //
+///////////////////////
+
+namespace std // NOLINT(cert-dcl58-cpp)
+{
+
+/// @brief hash value for JSON objects
+/// @sa https://json.nlohmann.me/api/basic_json/std_hash/
+NLOHMANN_BASIC_JSON_TPL_DECLARATION
+struct hash<nlohmann::NLOHMANN_BASIC_JSON_TPL>
+{
+    std::size_t operator()(const nlohmann::NLOHMANN_BASIC_JSON_TPL& j) const
+    {
+        return nlohmann::detail::hash(j);
+    }
+};
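+
+// With this specialization (and basic_json's operator==), JSON values can be
+// used as keys in unordered containers, e.g.
+// std::unordered_map<nlohmann::json, int>.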
+
+// specialization for std::less<value_t>
+template<>
+struct less< ::nlohmann::detail::value_t> // do not remove the space after '<', see https://github.com/nlohmann/json/pull/679
+{
+    /*!
+    @brief compare two value_t enum values
+    @since version 3.0.0
+    */
+    bool operator()(nlohmann::detail::value_t lhs,
+                    nlohmann::detail::value_t rhs) const noexcept
+    {
+        return nlohmann::detail::operator<(lhs, rhs);
+    }
+};
+
+// C++20 prohibits function specializations in the std namespace.
+#ifndef JSON_HAS_CPP_20
+
+/// @brief exchanges the values of two JSON objects
+/// @sa https://json.nlohmann.me/api/basic_json/std_swap/
+NLOHMANN_BASIC_JSON_TPL_DECLARATION
+inline void swap(nlohmann::NLOHMANN_BASIC_JSON_TPL& j1, nlohmann::NLOHMANN_BASIC_JSON_TPL& j2) noexcept(  // NOLINT(readability-inconsistent-declaration-parameter-name)
+    is_nothrow_move_constructible<nlohmann::NLOHMANN_BASIC_JSON_TPL>::value&&                          // NOLINT(misc-redundant-expression)
+    is_nothrow_move_assignable<nlohmann::NLOHMANN_BASIC_JSON_TPL>::value)
+{
+    j1.swap(j2);
+}
+
+#endif
+
+} // namespace std
+
+/// @brief user-defined string literal for JSON values
+/// @sa https://json.nlohmann.me/api/basic_json/operator_literal_json/
+JSON_HEDLEY_NON_NULL(1)
+inline nlohmann::json operator "" _json(const char* s, std::size_t n)
+{
+    return nlohmann::json::parse(s, s + n);
+}
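+
+// Illustrative use of the literal (example value):
+//   const auto j = R"({"pi": 3.141})"_json;  // parsed at the call site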
+
+/// @brief user-defined string literal for JSON pointer
+/// @sa https://json.nlohmann.me/api/basic_json/operator_literal_json_pointer/
+JSON_HEDLEY_NON_NULL(1)
+inline nlohmann::json::json_pointer operator "" _json_pointer(const char* s, std::size_t n)
+{
+    return nlohmann::json::json_pointer(std::string(s, n));
+}
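+
+// Illustrative use (example names):
+//   const auto ptr = "/foo/0"_json_pointer;
+//   // R"({"foo":[1,2]})"_json[ptr] == 1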
+
+// #include <nlohmann/detail/macro_unscope.hpp>
+
+
+// restore clang diagnostic settings
+#if defined(__clang__)
+    #pragma clang diagnostic pop
+#endif
+
+// clean up
+#undef JSON_ASSERT
+#undef JSON_INTERNAL_CATCH
+#undef JSON_CATCH
+#undef JSON_THROW
+#undef JSON_TRY
+#undef JSON_PRIVATE_UNLESS_TESTED
+#undef JSON_HAS_CPP_11
+#undef JSON_HAS_CPP_14
+#undef JSON_HAS_CPP_17
+#undef JSON_HAS_CPP_20
+#undef JSON_HAS_FILESYSTEM
+#undef JSON_HAS_EXPERIMENTAL_FILESYSTEM
+#undef NLOHMANN_BASIC_JSON_TPL_DECLARATION
+#undef NLOHMANN_BASIC_JSON_TPL
+#undef JSON_EXPLICIT
+#undef NLOHMANN_CAN_CALL_STD_FUNC_IMPL
+
+// #include <nlohmann/thirdparty/hedley/hedley_undef.hpp>
+
+
+#undef JSON_HEDLEY_ALWAYS_INLINE
+#undef JSON_HEDLEY_ARM_VERSION
+#undef JSON_HEDLEY_ARM_VERSION_CHECK
+#undef JSON_HEDLEY_ARRAY_PARAM
+#undef JSON_HEDLEY_ASSUME
+#undef JSON_HEDLEY_BEGIN_C_DECLS
+#undef JSON_HEDLEY_CLANG_HAS_ATTRIBUTE
+#undef JSON_HEDLEY_CLANG_HAS_BUILTIN
+#undef JSON_HEDLEY_CLANG_HAS_CPP_ATTRIBUTE
+#undef JSON_HEDLEY_CLANG_HAS_DECLSPEC_DECLSPEC_ATTRIBUTE
+#undef JSON_HEDLEY_CLANG_HAS_EXTENSION
+#undef JSON_HEDLEY_CLANG_HAS_FEATURE
+#undef JSON_HEDLEY_CLANG_HAS_WARNING
+#undef JSON_HEDLEY_COMPCERT_VERSION
+#undef JSON_HEDLEY_COMPCERT_VERSION_CHECK
+#undef JSON_HEDLEY_CONCAT
+#undef JSON_HEDLEY_CONCAT3
+#undef JSON_HEDLEY_CONCAT3_EX
+#undef JSON_HEDLEY_CONCAT_EX
+#undef JSON_HEDLEY_CONST
+#undef JSON_HEDLEY_CONSTEXPR
+#undef JSON_HEDLEY_CONST_CAST
+#undef JSON_HEDLEY_CPP_CAST
+#undef JSON_HEDLEY_CRAY_VERSION
+#undef JSON_HEDLEY_CRAY_VERSION_CHECK
+#undef JSON_HEDLEY_C_DECL
+#undef JSON_HEDLEY_DEPRECATED
+#undef JSON_HEDLEY_DEPRECATED_FOR
+#undef JSON_HEDLEY_DIAGNOSTIC_DISABLE_CAST_QUAL
+#undef JSON_HEDLEY_DIAGNOSTIC_DISABLE_CPP98_COMPAT_WRAP_
+#undef JSON_HEDLEY_DIAGNOSTIC_DISABLE_DEPRECATED
+#undef JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_CPP_ATTRIBUTES
+#undef JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNKNOWN_PRAGMAS
+#undef JSON_HEDLEY_DIAGNOSTIC_DISABLE_UNUSED_FUNCTION
+#undef JSON_HEDLEY_DIAGNOSTIC_POP
+#undef JSON_HEDLEY_DIAGNOSTIC_PUSH
+#undef JSON_HEDLEY_DMC_VERSION
+#undef JSON_HEDLEY_DMC_VERSION_CHECK
+#undef JSON_HEDLEY_EMPTY_BASES
+#undef JSON_HEDLEY_EMSCRIPTEN_VERSION
+#undef JSON_HEDLEY_EMSCRIPTEN_VERSION_CHECK
+#undef JSON_HEDLEY_END_C_DECLS
+#undef JSON_HEDLEY_FLAGS
+#undef JSON_HEDLEY_FLAGS_CAST
+#undef JSON_HEDLEY_GCC_HAS_ATTRIBUTE
+#undef JSON_HEDLEY_GCC_HAS_BUILTIN
+#undef JSON_HEDLEY_GCC_HAS_CPP_ATTRIBUTE
+#undef JSON_HEDLEY_GCC_HAS_DECLSPEC_ATTRIBUTE
+#undef JSON_HEDLEY_GCC_HAS_EXTENSION
+#undef JSON_HEDLEY_GCC_HAS_FEATURE
+#undef JSON_HEDLEY_GCC_HAS_WARNING
+#undef JSON_HEDLEY_GCC_NOT_CLANG_VERSION_CHECK
+#undef JSON_HEDLEY_GCC_VERSION
+#undef JSON_HEDLEY_GCC_VERSION_CHECK
+#undef JSON_HEDLEY_GNUC_HAS_ATTRIBUTE
+#undef JSON_HEDLEY_GNUC_HAS_BUILTIN
+#undef JSON_HEDLEY_GNUC_HAS_CPP_ATTRIBUTE
+#undef JSON_HEDLEY_GNUC_HAS_DECLSPEC_ATTRIBUTE
+#undef JSON_HEDLEY_GNUC_HAS_EXTENSION
+#undef JSON_HEDLEY_GNUC_HAS_FEATURE
+#undef JSON_HEDLEY_GNUC_HAS_WARNING
+#undef JSON_HEDLEY_GNUC_VERSION
+#undef JSON_HEDLEY_GNUC_VERSION_CHECK
+#undef JSON_HEDLEY_HAS_ATTRIBUTE
+#undef JSON_HEDLEY_HAS_BUILTIN
+#undef JSON_HEDLEY_HAS_CPP_ATTRIBUTE
+#undef JSON_HEDLEY_HAS_CPP_ATTRIBUTE_NS
+#undef JSON_HEDLEY_HAS_DECLSPEC_ATTRIBUTE
+#undef JSON_HEDLEY_HAS_EXTENSION
+#undef JSON_HEDLEY_HAS_FEATURE
+#undef JSON_HEDLEY_HAS_WARNING
+#undef JSON_HEDLEY_IAR_VERSION
+#undef JSON_HEDLEY_IAR_VERSION_CHECK
+#undef JSON_HEDLEY_IBM_VERSION
+#undef JSON_HEDLEY_IBM_VERSION_CHECK
+#undef JSON_HEDLEY_IMPORT
+#undef JSON_HEDLEY_INLINE
+#undef JSON_HEDLEY_INTEL_CL_VERSION
+#undef JSON_HEDLEY_INTEL_CL_VERSION_CHECK
+#undef JSON_HEDLEY_INTEL_VERSION
+#undef JSON_HEDLEY_INTEL_VERSION_CHECK
+#undef JSON_HEDLEY_IS_CONSTANT
+#undef JSON_HEDLEY_IS_CONSTEXPR_
+#undef JSON_HEDLEY_LIKELY
+#undef JSON_HEDLEY_MALLOC
+#undef JSON_HEDLEY_MCST_LCC_VERSION
+#undef JSON_HEDLEY_MCST_LCC_VERSION_CHECK
+#undef JSON_HEDLEY_MESSAGE
+#undef JSON_HEDLEY_MSVC_VERSION
+#undef JSON_HEDLEY_MSVC_VERSION_CHECK
+#undef JSON_HEDLEY_NEVER_INLINE
+#undef JSON_HEDLEY_NON_NULL
+#undef JSON_HEDLEY_NO_ESCAPE
+#undef JSON_HEDLEY_NO_RETURN
+#undef JSON_HEDLEY_NO_THROW
+#undef JSON_HEDLEY_NULL
+#undef JSON_HEDLEY_PELLES_VERSION
+#undef JSON_HEDLEY_PELLES_VERSION_CHECK
+#undef JSON_HEDLEY_PGI_VERSION
+#undef JSON_HEDLEY_PGI_VERSION_CHECK
+#undef JSON_HEDLEY_PREDICT
+#undef JSON_HEDLEY_PRINTF_FORMAT
+#undef JSON_HEDLEY_PRIVATE
+#undef JSON_HEDLEY_PUBLIC
+#undef JSON_HEDLEY_PURE
+#undef JSON_HEDLEY_REINTERPRET_CAST
+#undef JSON_HEDLEY_REQUIRE
+#undef JSON_HEDLEY_REQUIRE_CONSTEXPR
+#undef JSON_HEDLEY_REQUIRE_MSG
+#undef JSON_HEDLEY_RESTRICT
+#undef JSON_HEDLEY_RETURNS_NON_NULL
+#undef JSON_HEDLEY_SENTINEL
+#undef JSON_HEDLEY_STATIC_ASSERT
+#undef JSON_HEDLEY_STATIC_CAST
+#undef JSON_HEDLEY_STRINGIFY
+#undef JSON_HEDLEY_STRINGIFY_EX
+#undef JSON_HEDLEY_SUNPRO_VERSION
+#undef JSON_HEDLEY_SUNPRO_VERSION_CHECK
+#undef JSON_HEDLEY_TINYC_VERSION
+#undef JSON_HEDLEY_TINYC_VERSION_CHECK
+#undef JSON_HEDLEY_TI_ARMCL_VERSION
+#undef JSON_HEDLEY_TI_ARMCL_VERSION_CHECK
+#undef JSON_HEDLEY_TI_CL2000_VERSION
+#undef JSON_HEDLEY_TI_CL2000_VERSION_CHECK
+#undef JSON_HEDLEY_TI_CL430_VERSION
+#undef JSON_HEDLEY_TI_CL430_VERSION_CHECK
+#undef JSON_HEDLEY_TI_CL6X_VERSION
+#undef JSON_HEDLEY_TI_CL6X_VERSION_CHECK
+#undef JSON_HEDLEY_TI_CL7X_VERSION
+#undef JSON_HEDLEY_TI_CL7X_VERSION_CHECK
+#undef JSON_HEDLEY_TI_CLPRU_VERSION
+#undef JSON_HEDLEY_TI_CLPRU_VERSION_CHECK
+#undef JSON_HEDLEY_TI_VERSION
+#undef JSON_HEDLEY_TI_VERSION_CHECK
+#undef JSON_HEDLEY_UNAVAILABLE
+#undef JSON_HEDLEY_UNLIKELY
+#undef JSON_HEDLEY_UNPREDICTABLE
+#undef JSON_HEDLEY_UNREACHABLE
+#undef JSON_HEDLEY_UNREACHABLE_RETURN
+#undef JSON_HEDLEY_VERSION
+#undef JSON_HEDLEY_VERSION_DECODE_MAJOR
+#undef JSON_HEDLEY_VERSION_DECODE_MINOR
+#undef JSON_HEDLEY_VERSION_DECODE_REVISION
+#undef JSON_HEDLEY_VERSION_ENCODE
+#undef JSON_HEDLEY_WARNING
+#undef JSON_HEDLEY_WARN_UNUSED_RESULT
+#undef JSON_HEDLEY_WARN_UNUSED_RESULT_MSG
+#undef JSON_HEDLEY_FALL_THROUGH
+
+
+
+#endif  // INCLUDE_NLOHMANN_JSON_HPP_
diff --git a/data-access/engine/imcopy.c.modif b/data-access/engine/imcopy.c.modif
new file mode 100644
index 0000000000000000000000000000000000000000..d481b8d1ad812a406bfc3ac43cb6dc9dc8044bff
--- /dev/null
+++ b/data-access/engine/imcopy.c.modif
@@ -0,0 +1,280 @@
+#include <string.h>
+#include <stdio.h>
+#include <stdlib.h>
+#include "fitsio.h"
+
+int main(int argc, char *argv[])
+{
+    fitsfile *infptr, *outfptr;   /* FITS file pointers defined in fitsio.h */
+    int status = 0, tstatus, ii = 1, iteration = 0, single = 0, hdupos;
+    int hdutype, bitpix, bytepix, naxis = 0, nkeys, datatype = 0, anynul;
+    long naxes[9] = {1, 1, 1, 1, 1, 1, 1, 1, 1};
+    long first, totpix = 0, npix;
+    double *array, bscale = 1.0, bzero = 0.0, nulval = 0.;
+    char card[81];
+
+    if (argc != 3)
+    {
+ printf("\n");
+ printf("Usage:  imcopy inputImage outputImage[compress]\n");
+ printf("\n");
+ printf("Copy an input image to an output image, optionally compressing\n");
+ printf("or uncompressing the image in the process.  If the [compress]\n");
+ printf("qualifier is appended to the output file name then the input image\n");
+ printf("will be compressed using the tile-compressed format.  In this format,\n");
+ printf("the image is divided into rectangular tiles and each tile of pixels\n");
+ printf("is compressed and stored in a variable-length row of a binary table.\n");
+ printf("If the [compress] qualifier is omitted, and the input image is\n");
+ printf("in tile-compressed format, then the output image will be uncompressed.\n");
+ printf("\n");
+ printf("If an extension name or number is appended to the input file name, \n");
+ printf("enclosed in square brackets, then only that single extension will be\n");
+ printf("copied to the output file.  Otherwise, every extension in the input file\n");
+ printf("will be processed in turn and copied to the output file.\n");
+ printf("\n");
+ printf("Examples:\n");
+ printf("\n");
+ printf("1)  imcopy image.fit 'cimage.fit[compress]'\n");
+ printf("\n");
+ printf("    This compresses the input image using the default parameters, i.e.,\n");
+ printf("    using the Rice compression algorithm and using row by row tiles.\n");
+ printf("\n");
+ printf("2)  imcopy cimage.fit image2.fit\n");
+ printf("\n");
+ printf("    This uncompresses the image created in the first example.\n");
+ printf("    image2.fit should be identical to image.fit if the image\n");
+ printf("    has an integer datatype.  There will be small differences\n");
+ printf("    in the pixel values if it is a floating point image.\n");
+ printf("\n");
+ printf("3)  imcopy image.fit 'cimage.fit[compress GZIP 100,100;q 16]'\n");
+ printf("\n");
+ printf("    This compresses the input image using the following parameters:\n");
+ printf("         GZIP compression algorithm;\n");
+ printf("         100 X 100 pixel compression tiles;\n");
+ printf("         quantization level = 16 (only used with floating point images)\n");
+ printf("\n");
+ printf("The full syntax of the compression qualifier is:\n");
+ printf("    [compress ALGORITHM TDIM1,TDIM2,...; q QLEVEL s SCALE]\n");
+ printf("where the allowed ALGORITHM values are:\n");
+ printf("      Rice, HCOMPRESS, HSCOMPRESS, GZIP, or PLIO. \n");
+ printf("       (HSCOMPRESS is a variant of HCOMPRESS in which a small\n");
+ printf("        amount of smoothing is applied to the uncompressed image\n");
+ printf("        to help suppress blocky compression artifacts in the image\n");
+ printf("        when using large values for the 'scale' parameter).\n");
+ printf("TDIMn is the size of the compression tile in each dimension,\n");
+ printf("\n");
+ printf("QLEVEL specifies the quantization level when converting a floating\n");
+ printf("point image into integers, prior to compressing the image.  The\n");
+ printf("default value = 16, which means the image will be quantized into\n");
+ printf("integer levels that are spaced at intervals of sigma/16., where \n");
+ printf("sigma is the estimated noise level in background areas of the image.\n");
+ printf("If QLEVEL is negative, this means use the absolute value for the\n");
+ printf("quantization spacing (e.g. 'q -0.005' means quantize the floating\n");
+ printf("point image such that the scaled integers represent steps of 0.005\n");
+ printf("in the original image).\n");
+ printf("\n");
+ printf("SCALE is the integer scale factor that only applies to the HCOMPRESS\n");
+ printf("algorithm.  The default value SCALE = 0 forces the image to be\n");
+ printf("losslessly compressed; Greater amounts of lossy compression (resulting\n");
+ printf("in smaller compressed files) can be specified with larger SCALE values.\n");
+ printf("\n");
+ printf("\n");
+ printf("Note that it may be necessary to enclose the file names\n");
+ printf("in single quote characters on the Unix command line.\n");
+      return(0);
+    }
+
+    /* Open the input file and create output file */
+    fits_open_file(&infptr, argv[1], READONLY, &status);
+    fits_create_file(&outfptr, "-", &status);
+    //fits_create_file(&outfptr, argv[2], &status);
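+    /* Note: this modified version streams the output FITS file to stdout
+       ("-" in fits_create_file) instead of creating the file named by
+       argv[2]; the original call is kept above, commented out. */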
+
+    if (status != 0) {    
+        fits_report_error(stderr, status);
+        return(status);
+    }
+
+    fits_get_hdu_num(infptr, &hdupos);  /* Get the current HDU position */
+
+    /* Copy only a single HDU if a specific extension was given */ 
+    if (hdupos != 1 || strchr(argv[1], '[')) single = 1;
+
+    for (; !status; hdupos++)  /* Main loop through each extension */
+    {
+
+      fits_get_hdu_type(infptr, &hdutype, &status);
+
+      if (hdutype == IMAGE_HDU) {
+
+          /* get image dimensions and total number of pixels in image */
+          for (ii = 0; ii < 9; ii++)
+              naxes[ii] = 1;
+
+          fits_get_img_param(infptr, 9, &bitpix, &naxis, naxes, &status);
+
+          totpix = naxes[0] * naxes[1] * naxes[2] * naxes[3] * naxes[4]
+             * naxes[5] * naxes[6] * naxes[7] * naxes[8];
+
+         //printf("totpix: %d\n", totpix );
+
+      }
+
+      if (hdutype != IMAGE_HDU || naxis == 0 || totpix == 0) { 
+
+          /* just copy tables and null images */
+          fits_copy_hdu(infptr, outfptr, 0, &status);
+
+      } else {
+
+          /* Explicitly create new image, to support compression */
+          fits_create_img(outfptr, bitpix, naxis, naxes, &status);
+          if (status) {
+                 fits_report_error(stderr, status);
+                 return(status);
+          }
+
+          if (fits_is_compressed_image(outfptr, &status)) {
+              /* write default EXTNAME keyword if it doesn't already exist */
+	      tstatus = 0;
+              fits_read_card(infptr, "EXTNAME", card, &tstatus);
+	      if (tstatus) {
+	         strcpy(card, "EXTNAME = 'COMPRESSED_IMAGE'   / name of this binary table extension");
+	         fits_write_record(outfptr, card, &status);
+	      }
+          }
+	  	    
+          /* copy all the user keywords (not the structural keywords) */
+          fits_get_hdrspace(infptr, &nkeys, NULL, &status); 
+
+          for (ii = 1; ii <= nkeys; ii++) {
+              fits_read_record(infptr, ii, card, &status);
+              if (fits_get_keyclass(card) > TYP_CMPRS_KEY)
+                  fits_write_record(outfptr, card, &status);
+          }
+
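+         /* The commented-out block below appears to be debug code that
+            dumps the copied header cards to stdout and pads the header to a
+            card boundary; note that a FITS block is 2880 bytes = 36 cards of
+            80 characters, so the `% 32` / `<= 32` limits would need to
+            become 36 if this were ever re-enabled. */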
+         /*/////////////////////
+
+         int mynkeys;
+          fits_get_hdrspace(infptr, &mynkeys, NULL, &status); 
+         //printf("output nkeys: %d\n", mynkeys);
+
+          char newcard[FLEN_CARD];
+          for(ii=1; ii<=80; ii++) newcard[ii-1] = ' ';
+          int keytype;
+          for (ii = 1; ii <= mynkeys; ii++) {
+              fits_read_record(infptr, ii, newcard, &status);
+              //fits_parse_template(newcard, card, &keytype, &status);
+              printf("%-80s",newcard);
+          }
+
+         char end[FLEN_CARD] = "END                                                                             ";
+         printf("%80s",end);
+
+         int nfill = (mynkeys + 1) % 32;
+
+         if(nfill != 0){
+            char empty[FLEN_CARD] = "                                                                                ";
+            for (ii = nfill+1; ii <= 32; ii++) {
+              printf("%80s",empty);
+            }
+         }
+
+         /////////////////////*/
+
+
+              /* delete default EXTNAME keyword if it exists */
+/*
+          if (!fits_is_compressed_image(outfptr, &status)) {
+	      tstatus = 0;
+              fits_read_key(outfptr, TSTRING, "EXTNAME", card, NULL, &tstatus);
+	      if (!tstatus) {
+	         if (strcmp(card, "COMPRESSED_IMAGE") == 0)
+	            fits_delete_key(outfptr, "EXTNAME", &status);
+	      }
+          }
+*/
+	  
+          switch(bitpix) {
+              case BYTE_IMG:
+                  datatype = TBYTE;
+                  break;
+              case SHORT_IMG:
+                  datatype = TSHORT;
+                  break;
+              case LONG_IMG:
+                  datatype = TINT;
+                  break;
+              case FLOAT_IMG:
+                  datatype = TFLOAT;
+                  break;
+              case DOUBLE_IMG:
+                  datatype = TDOUBLE;
+                  break;
+          }
+
+          bytepix = abs(bitpix) / 8;
+
+          npix = totpix;
+          iteration = 0;
+
+          /* try to allocate memory for the entire image */
+          /* use double type to force memory alignment */
+          array = (double *) calloc(npix, bytepix);
+
+          /* if allocation failed, divide size by 2 and try again */
+          while (!array && iteration < 10)  {
+              iteration++;
+              npix = npix / 2;
+              array = (double *) calloc(npix, bytepix);
+          }
+
+          if (!array)  {
+              printf("Memory allocation error\n");
+              return(0);
+          }
+
+          /* turn off any scaling so that we copy the raw pixel values */
+          fits_set_bscale(infptr,  bscale, bzero, &status);
+          fits_set_bscale(outfptr, bscale, bzero, &status);
+
+          char * chararray = (char*)array;
+          first = 1;
+          while (totpix > 0 && !status)
+          {
+             /* read all or part of image then write it back to the output file */
+             fits_read_img(infptr, datatype, first, npix, 
+                     &nulval, array, &anynul, &status);
+
+             /*//////////////////////////////////////////////////////////////////////
+            for(ii=0; ii<npix; ii++){
+               int jj;
+               for(jj=0; jj<bytepix; jj++) printf("%c", jj);// ((char*)&(array[ii]))[jj]);
+            }
+             //////////////////////////////////////////////////////////////////////*/
+
+             fits_write_img(outfptr, datatype, first, npix, array, &status);
+             totpix = totpix - npix;
+             first  = first  + npix;
+          }
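+          /* Similarly disabled below: padding of the data unit to a block
+             boundary; FITS blocks are 2880 bytes (80 x 36), not 80 x 32. */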
+          /*/////////////////////////////////////////////////////
+          nfill = (totpix*bytepix) % (80*32);
+          if(nfill !=0 )
+            for(ii=nfill+1; ii<(80*32); ii++) printf("%c",'\0');
+          /////////////////////////////////////////////////////*/
+          free(array);
+      }
+
+      if (single) break;  /* quit if only copying a single HDU */
+      fits_movrel_hdu(infptr, 1, NULL, &status);  /* try to move to next HDU */
+    }
+
+    if (status == END_OF_FILE)  status = 0; /* Reset after normal error */
+
+    fits_close_file(outfptr,  &status);
+    fits_close_file(infptr, &status);
+
+    /* if error occurred, print out error message */
+    if (status)
+       fits_report_error(stderr, status);
+    return(status);
+}
diff --git a/data-access/engine/imcopy.c.orig b/data-access/engine/imcopy.c.orig
new file mode 100644
index 0000000000000000000000000000000000000000..253205d97c908521c67baf05d82e79542022bfde
--- /dev/null
+++ b/data-access/engine/imcopy.c.orig
@@ -0,0 +1,233 @@
+#include <string.h>
+#include <stdio.h>
+#include <stdlib.h>
+#include "fitsio.h"
+
+int main(int argc, char *argv[])
+{
+    fitsfile *infptr, *outfptr;   /* FITS file pointers defined in fitsio.h */
+    int status = 0, tstatus, ii = 1, iteration = 0, single = 0, hdupos;
+    int hdutype, bitpix, bytepix, naxis = 0, nkeys, datatype = 0, anynul;
+    long naxes[9] = {1, 1, 1, 1, 1, 1, 1, 1, 1};
+    long first, totpix = 0, npix;
+    double *array, bscale = 1.0, bzero = 0.0, nulval = 0.;
+    char card[81];
+
+    if (argc != 3)
+    {
+ printf("\n");
+ printf("Usage:  imcopy inputImage outputImage[compress]\n");
+ printf("\n");
+ printf("Copy an input image to an output image, optionally compressing\n");
+ printf("or uncompressing the image in the process.  If the [compress]\n");
+ printf("qualifier is appended to the output file name then the input image\n");
+ printf("will be compressed using the tile-compressed format.  In this format,\n");
+ printf("the image is divided into rectangular tiles and each tile of pixels\n");
+ printf("is compressed and stored in a variable-length row of a binary table.\n");
+ printf("If the [compress] qualifier is omitted, and the input image is\n");
+ printf("in tile-compressed format, then the output image will be uncompressed.\n");
+ printf("\n");
+ printf("If an extension name or number is appended to the input file name, \n");
+ printf("enclosed in square brackets, then only that single extension will be\n");
+ printf("copied to the output file.  Otherwise, every extension in the input file\n");
+ printf("will be processed in turn and copied to the output file.\n");
+ printf("\n");
+ printf("Examples:\n");
+ printf("\n");
+ printf("1)  imcopy image.fit 'cimage.fit[compress]'\n");
+ printf("\n");
+ printf("    This compresses the input image using the default parameters, i.e.,\n");
+ printf("    using the Rice compression algorithm and using row by row tiles.\n");
+ printf("\n");
+ printf("2)  imcopy cimage.fit image2.fit\n");
+ printf("\n");
+ printf("    This uncompresses the image created in the first example.\n");
+ printf("    image2.fit should be identical to image.fit if the image\n");
+ printf("    has an integer datatype.  There will be small differences\n");
+ printf("    in the pixel values if it is a floating point image.\n");
+ printf("\n");
+ printf("3)  imcopy image.fit 'cimage.fit[compress GZIP 100,100;q 16]'\n");
+ printf("\n");
+ printf("    This compresses the input image using the following parameters:\n");
+ printf("         GZIP compression algorithm;\n");
+ printf("         100 X 100 pixel compression tiles;\n");
+ printf("         quantization level = 16 (only used with floating point images)\n");
+ printf("\n");
+ printf("The full syntax of the compression qualifier is:\n");
+ printf("    [compress ALGORITHM TDIM1,TDIM2,...; q QLEVEL s SCALE]\n");
+ printf("where the allowed ALGORITHM values are:\n");
+ printf("      Rice, HCOMPRESS, HSCOMPRESS, GZIP, or PLIO. \n");
+ printf("       (HSCOMPRESS is a variant of HCOMPRESS in which a small\n");
+ printf("        amount of smoothing is applied to the uncompressed image\n");
+ printf("        to help suppress blocky compression artifacts in the image\n");
+ printf("        when using large values for the 'scale' parameter).\n");
+ printf("TDIMn is the size of the compression tile in each dimension,\n");
+ printf("\n");
+ printf("QLEVEL specifies the quantization level when converting a floating\n");
+ printf("point image into integers, prior to compressing the image.  The\n");
+ printf("default value = 16, which means the image will be quantized into\n");
+ printf("integer levels that are spaced at intervals of sigma/16., where \n");
+ printf("sigma is the estimated noise level in background areas of the image.\n");
+ printf("If QLEVEL is negative, this means use the absolute value for the\n");
+ printf("quantization spacing (e.g. 'q -0.005' means quantize the floating\n");
+ printf("point image such that the scaled integers represent steps of 0.005\n");
+ printf("in the original image).\n");
+ printf("\n");
+ printf("SCALE is the integer scale factor that only applies to the HCOMPRESS\n");
+ printf("algorithm.  The default value SCALE = 0 forces the image to be\n");
+ printf("losslessly compressed; Greater amounts of lossy compression (resulting\n");
+ printf("in smaller compressed files) can be specified with larger SCALE values.\n");
+ printf("\n");
+ printf("\n");
+ printf("Note that it may be necessary to enclose the file names\n");
+ printf("in single quote characters on the Unix command line.\n");
+      return(0);
+    }
+
+    /* Open the input file and create output file */
+    fits_open_file(&infptr, argv[1], READONLY, &status);
+    fits_create_file(&outfptr, argv[2], &status);
+
+    if (status != 0) {    
+        fits_report_error(stderr, status);
+        return(status);
+    }
+
+    fits_get_hdu_num(infptr, &hdupos);  /* Get the current HDU position */
+
+    /* Copy only a single HDU if a specific extension was given */ 
+    if (hdupos != 1 || strchr(argv[1], '[')) single = 1;
+
+    for (; !status; hdupos++)  /* Main loop through each extension */
+    {
+
+      fits_get_hdu_type(infptr, &hdutype, &status);
+
+      if (hdutype == IMAGE_HDU) {
+
+          /* get image dimensions and total number of pixels in image */
+          for (ii = 0; ii < 9; ii++)
+              naxes[ii] = 1;
+
+          fits_get_img_param(infptr, 9, &bitpix, &naxis, naxes, &status);
+
+          totpix = naxes[0] * naxes[1] * naxes[2] * naxes[3] * naxes[4]
+             * naxes[5] * naxes[6] * naxes[7] * naxes[8];
+      }
+
+      if (hdutype != IMAGE_HDU || naxis == 0 || totpix == 0) { 
+
+          /* just copy tables and null images */
+          fits_copy_hdu(infptr, outfptr, 0, &status);
+
+      } else {
+
+          /* Explicitly create new image, to support compression */
+          fits_create_img(outfptr, bitpix, naxis, naxes, &status);
+          if (status) {
+                 fits_report_error(stderr, status);
+                 return(status);
+          }
+
+          if (fits_is_compressed_image(outfptr, &status)) {
+              /* write default EXTNAME keyword if it doesn't already exist */
+	      tstatus = 0;
+              fits_read_card(infptr, "EXTNAME", card, &tstatus);
+	      if (tstatus) {
+	         strcpy(card, "EXTNAME = 'COMPRESSED_IMAGE'   / name of this binary table extension");
+	         fits_write_record(outfptr, card, &status);
+	      }
+          }
+	  	    
+          /* copy all the user keywords (not the structural keywords) */
+          fits_get_hdrspace(infptr, &nkeys, NULL, &status); 
+
+          for (ii = 1; ii <= nkeys; ii++) {
+              fits_read_record(infptr, ii, card, &status);
+              if (fits_get_keyclass(card) > TYP_CMPRS_KEY)
+                  fits_write_record(outfptr, card, &status);
+          }
+
+              /* delete default EXTNAME keyword if it exists */
+/*
+          if (!fits_is_compressed_image(outfptr, &status)) {
+	      tstatus = 0;
+              fits_read_key(outfptr, TSTRING, "EXTNAME", card, NULL, &tstatus);
+	      if (!tstatus) {
+	         if (strcmp(card, "COMPRESSED_IMAGE") == 0)
+	            fits_delete_key(outfptr, "EXTNAME", &status);
+	      }
+          }
+*/
+	  
+          switch(bitpix) {
+              case BYTE_IMG:
+                  datatype = TBYTE;
+                  break;
+              case SHORT_IMG:
+                  datatype = TSHORT;
+                  break;
+              case LONG_IMG:
+                  datatype = TINT;
+                  break;
+              case FLOAT_IMG:
+                  datatype = TFLOAT;
+                  break;
+              case DOUBLE_IMG:
+                  datatype = TDOUBLE;
+                  break;
+          }
+
+          bytepix = abs(bitpix) / 8;
+
+          npix = totpix;
+          iteration = 0;
+
+          /* try to allocate memory for the entire image */
+          /* use double type to force memory alignment */
+          array = (double *) calloc(npix, bytepix);
+
+          /* if allocation failed, divide size by 2 and try again */
+          while (!array && iteration < 10)  {
+              iteration++;
+              npix = npix / 2;
+              array = (double *) calloc(npix, bytepix);
+          }
+
+          if (!array)  {
+              printf("Memory allocation error\n");
+              return(0);
+          }
+
+          /* turn off any scaling so that we copy the raw pixel values */
+          fits_set_bscale(infptr,  bscale, bzero, &status);
+          fits_set_bscale(outfptr, bscale, bzero, &status);
+
+          first = 1;
+          while (totpix > 0 && !status)
+          {
+             /* read all or part of image then write it back to the output file */
+             fits_read_img(infptr, datatype, first, npix, 
+                     &nulval, array, &anynul, &status);
+
+             fits_write_img(outfptr, datatype, first, npix, array, &status);
+             totpix = totpix - npix;
+             first  = first  + npix;
+          }
+          free(array);
+      }
+
+      if (single) break;  /* quit if only copying a single HDU */
+      fits_movrel_hdu(infptr, 1, NULL, &status);  /* try to move to next HDU */
+    }
+
+    if (status == END_OF_FILE)  status = 0; /* Reset after normal error */
+
+    fits_close_file(outfptr,  &status);
+    fits_close_file(infptr, &status);
+
+    /* if error occurred, print out error message */
+    if (status)
+       fits_report_error(stderr, status);
+    return(status);
+}
diff --git a/data-access/engine/resources/Obsolete/GARR/Obsolete/datasets.conf.old-keynames b/data-access/engine/resources/Obsolete/GARR/Obsolete/datasets.conf.old-keynames
new file mode 100644
index 0000000000000000000000000000000000000000..777f53deaf5b8849c8015fbabd1c740db3d4f6c9
--- /dev/null
+++ b/data-access/engine/resources/Obsolete/GARR/Obsolete/datasets.conf.old-keynames
@@ -0,0 +1,18 @@
+
+
+logfilename=vlkb-datasets.log
+fitsdir=/mnt/ia2-vo.oats.inaf.it/vialactea-devel
+fitscutdir=/mnt/ia2-vo.oats.inaf.it/vialactea-devel/cutouts
+remotefitsdir=http://ia2-vo.oats.inaf.it/vialactea-devel
+obscore_publisher=ivo://ia2.inaf.it/vlkb/dsetdesc?
+obscore_access_format=application/fits
+
+
+dbms=postgresql
+host_name=127.0.0.1
+port=5432
+db_name=vialactea
+schema=datasets
+user_name=vladmin
+password=IA2lbt09
+
diff --git a/data-access/engine/resources/Obsolete/GARR/Obsolete/datasets.confdevel.old-keynames b/data-access/engine/resources/Obsolete/GARR/Obsolete/datasets.confdevel.old-keynames
new file mode 100644
index 0000000000000000000000000000000000000000..1fb5cba129b80aefc696fe439dd5f71e92a0c613
--- /dev/null
+++ b/data-access/engine/resources/Obsolete/GARR/Obsolete/datasets.confdevel.old-keynames
@@ -0,0 +1,19 @@
+
+
+logfilename=vlkb.log
+fitsdir=/mnt/ia2-vo.oats.inaf.it/vialactea-devel
+fitscutdir=/mnt/ia2-vo.oats.inaf.it/vialactea-devel/cutouts
+remotefitsdir=http://ia2-vo.oats.inaf.it/vialactea-devel
+obscore_publisher=ivo://ia2.inaf.it/vlkb/dsetdesc?
+obscore_access_format=application/fits
+
+
+host_name=127.0.0.1
+port=5432
+dbms=postgresql
+db_name=testvl
+schema=testdatasets
+user_name=testadmin
+password=IA2lbt09
+
+
diff --git a/data-access/engine/resources/Obsolete/IA2/Obsolete/datasets.conf.old-keynames b/data-access/engine/resources/Obsolete/IA2/Obsolete/datasets.conf.old-keynames
new file mode 100644
index 0000000000000000000000000000000000000000..4f641b6d588d85838024c42bc62c072f5b0e169d
--- /dev/null
+++ b/data-access/engine/resources/Obsolete/IA2/Obsolete/datasets.conf.old-keynames
@@ -0,0 +1,22 @@
+
+ivoid_authority=ia2.inaf.it
+
+
+logfilename=vlkb-datasets.log
+logdir=/tmp
+fitsdir=/srv/vlkb/surveys
+fitscutdir=/srv/vlkb/cutouts
+remotefitsdir=http://ia2-vo.oats.inaf.it/vialactea-devel
+obscore_publisher=ivo://ia2.inaf.it/vlkb/dsetdesc?
+obscore_access_format=application/fits
+
+
+dbms=postgresql
+host_name=pasquale.ia2.inaf.it
+port=5432
+db_name=vialactea
+schema=datasets
+user_name=vialactea
+password=ia2vlkb
+
+
diff --git a/data-access/engine/resources/Obsolete/IA2/Obsolete/datasets.confdevel-on-host-vlkb.old-keynames b/data-access/engine/resources/Obsolete/IA2/Obsolete/datasets.confdevel-on-host-vlkb.old-keynames
new file mode 100644
index 0000000000000000000000000000000000000000..560e8cfc459a846964e7284014e0de608547e44e
--- /dev/null
+++ b/data-access/engine/resources/Obsolete/IA2/Obsolete/datasets.confdevel-on-host-vlkb.old-keynames
@@ -0,0 +1,22 @@
+
+ivoid_authority=ia2.inaf.it
+
+
+logfilename=vlkb-datasets.log
+logdir=/tmp
+fitsdir=/srv/vlkb/devel/surveys
+fitscutdir=/srv/vlkb/devel/cutouts
+remotefitsdir=http://ia2-vo.oats.inaf.it/vialactea-devel
+obscore_publisher=ivo://ia2.inaf.it/vlkb/dsetdesc?
+obscore_access_format=application/fits
+
+
+dbms=postgresql
+host_name=pasquale.ia2.inaf.it
+port=5432
+db_name=vialacteadevel
+schema=datasetsdevel
+user_name=vialactea
+password=ia2vlkb
+
+
diff --git a/data-access/engine/resources/Obsolete/IA2/Obsolete/datasets.confdevel.old-keynames b/data-access/engine/resources/Obsolete/IA2/Obsolete/datasets.confdevel.old-keynames
new file mode 100644
index 0000000000000000000000000000000000000000..4f26ce062a163dd4121456109ff801820a94de23
--- /dev/null
+++ b/data-access/engine/resources/Obsolete/IA2/Obsolete/datasets.confdevel.old-keynames
@@ -0,0 +1,22 @@
+
+ivoid_authority=ia2.inaf.it
+
+
+logfilename=vlkb-datasets.log
+logdir=/tmp
+fitsdir=/srv/vlkb/devel/surveys
+fitscutdir=/srv/vlkb/devel/cutouts
+remotefitsdir=http://ia2-vo.oats.inaf.it/vialactea-devel
+obscore_publisher=ivo://ia2.inaf.it/vlkb/dsetdesc?
+obscore_access_format=application/fits
+
+
+dbms=postgresql
+host_name=127.0.0.1
+port=5432
+db_name=vialactea
+schema=datasets
+user_name=vialactea
+password=ia2vlkb
+
+
diff --git a/data-access/engine/resources/Obsolete/IA2/vlkbddevel.service b/data-access/engine/resources/Obsolete/IA2/vlkbddevel.service
new file mode 100644
index 0000000000000000000000000000000000000000..904352736f9a5c2e09b13268b86e9af46eac29aa
--- /dev/null
+++ b/data-access/engine/resources/Obsolete/IA2/vlkbddevel.service
@@ -0,0 +1,11 @@
+
+[Unit]
+Description=VLKB dataset service (devel)
+After=network.target rabbitmq-server.service
+[Service]
+Type=forking
+ExecStart=/usr/local/bin/vlkbddevel localhost 5672 vlkbdevel /etc/vlkb/datasets.confdevel
+ExecStop=/bin/kill -p $MAINPID
+[Install]
+WantedBy=multi-user.target
+
diff --git a/data-access/engine/resources/Obsolete/survey_populate.csv b/data-access/engine/resources/Obsolete/survey_populate.csv
new file mode 100644
index 0000000000000000000000000000000000000000..67981f2a2c216723a291bf24664ffac0953e7887
--- /dev/null
+++ b/data-access/engine/resources/Obsolete/survey_populate.csv
@@ -0,0 +1,100 @@
+survey_id,name,species,transition,rest_frequency,restf_fits_unit,velocity_fits_unit,storage_path,file_filter,description
+1,ExtMaps,distance,rm10,0,Hz,kpc,extinction_maps,%_rm10.fits,"Galactic Plane Extinction maps (Arab & Cambresy 2017) - 10 arcmin resolution using 2MASS"
+2,ExtMaps,distance,rm5,0,Hz,kpc,extinction_maps,%_rm5.fits,"Galactic Plane Extinction maps (Arab & Cambresy 2017) - 5 arcmin resolution using 2MASS"
+3,"Mopra GPS",12CO,1-0,115271000000,Hz,m.s**-1,MOPRA/12CO,%.fits,"Mopra Galactic Plane Survey (Burton et al. 2017)"
+4,"Mopra GPS",13CO,1-0,110201000000,Hz,m.s**-1,MOPRA/13CO,%.fits,"Mopra Galactic Plane Survey (Burton et al. 2017)"
+5,"Mopra GPS",C17O,1-0,112359000000,Hz,m.s**-1,MOPRA/C17O,%.fits,"Mopra Galactic Plane Survey (Burton et al. 2017)"
+6,"Mopra GPS",C18O,1-0,109782000000,Hz,m.s**-1,MOPRA/C18O,%.fits,"Mopra Galactic Plane Survey (Burton et al. 2017)"
+7,CHIMPS,C18O,3-2,329331000000,Hz,km.s**-1,CHIMPS,CHIMPS_C18O_%.fits,"JCMT CHIMPS Survey (Rigby et al. 2016, MNRAS 456, 2885)"
+8,CHIMPS,13CO,3-2,330588000000,Hz,km.s**-1,CHIMPS,CHIMPS_13CO_%.fits,"JCMT CHIMPS Survey (Rigby et al. 2016, MNRAS 456, 2885)"
+9,CHaMP,HCO+,1-0,89188520000,Hz,m.s**-1,CHaMP,%,"Mopra CHaMP Survey (Barnes et al. 2011, ApJS 196, 12)"
+10,HOPS,H2O,6-1-6_5-2-3,22235080000,Hz,m.s**-1,HOPS,G%-H2O-cube.fits,"Mopra HOPS Survey (Walsh et al. 2011, MNRAS 416, 1764)"
+11,HOPS,NH3,1-1_1-1,23694495487,Hz,m.s**-1,HOPS,G%-NH3-11-cube.fits,"Mopra HOPS Survey (Walsh et al. 2011, MNRAS 416, 1764)"
+12,HOPS,NH3,2-2_2-2,23722633335,Hz,m.s**-1,HOPS,G%-NH3-22-cube.fits,"Mopra HOPS Survey (Walsh et al. 2011, MNRAS 416, 1764)"
+13,GRS,13CO,1-0,110201000000,GHz,m.s**-1,FCRAO_13CO_1_0_GRS,%,"BU-FCRAO Galactic Ring Survey (Jackson et al. 2011, ApJS 163, 145)"
+14,MALT90,N2H+,1-0,93173773438,Hz,m.s**-1,MALT90/n2hp,G%_n2hp_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)"
+15,MALT90,13CS,2-1,92494303000,Hz,m.s**-1,MALT90/13cs,G%_13cs_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)"
+16,MALT90,H,41a,92034475000,Hz,m.s**-1,MALT90/h41a,G%_h41a_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)"
+17,MALT90,CH3CN,51-41,91985316000,Hz,m.s**-1,MALT90/ch3cn,G%_ch3cn_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)"
+18,MALT90,HC3N,10-9,91199796000,Hz,m.s**-1,MALT90/hc3n,G%_hc3n_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)"
+19,MALT90,13C34S,2-1,90926036000,Hz,m.s**-1,MALT90/13c34s,G%_13c34s_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)"
+20,MALT90,HNC,1-0,90663570313,Hz,m.s**-1,MALT90/hnc,G%_hnc_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)"
+21,MALT90,HC13CCN,10-9_9-8,90593059000,Hz,m.s**-1,MALT90/hc13ccn,G%_hc13ccn_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)"
+22,MALT90,HCO+,1-0,89188523438,Hz,m.s**-1,MALT90/hcop,G%_hcop_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)"
+23,MALT90,HCN,1-0,88631843750,Hz,m.s**-1,MALT90/hcn,G%_hcn_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)"
+24,MALT90,HNCO,413-312,88239027000,Hz,m.s**-1,MALT90/hnco413,G%_hnco413_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)"
+25,MALT90,HNCO,404-303,87925238888,Hz,m.s**-1,MALT90/hnco404,G%_hnco404_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)"
+26,MALT90,C2H,1-0_3/2-1/2_2-1,87316925000,Hz,m.s**-1,MALT90/c2h,G%_c2h_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)"
+27,MALT90,HN13C,1-0,87090850000,Hz,m.s**-1,MALT90/hn13c,G%_hn13c_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)"
+28,MALT90,SiO,2-1,86847010000,Hz,m.s**-1,MALT90/sio,G%_sio_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)"
+29,MALT90,H13CO+,1-0,86754330000,Hz,m.s**-1,MALT90/h13cop,G%_h13cop_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)"
+30,ThrUMMS,12CO,1-0,115271000000,Hz,m.s**-1,ThrUMMS,%.12co.fits,"Mopra ThrUMMS Survey (Barnes et al. 2015, ApJ 812, 6)"
+31,ThrUMMS,13CO,1-0,110201000000,Hz,m.s**-1,ThrUMMS,%.13co.fits,"Mopra ThrUMMS Survey (Barnes et al. 2015, ApJ 812, 6)"
+32,ThrUMMS,C18O,1-0,109782000000,Hz,m.s**-1,ThrUMMS,%.c18o.fits,"Mopra ThrUMMS Survey (Barnes et al. 2015, ApJ 812, 6)"
+33,ThrUMMS,CN,1-2-3_0-1-2,113491000000,Hz,m.s**-1,ThrUMMS,%.cn.fits,"Mopra ThrUMMS Survey (Barnes et al. 2015, ApJ 812, 6)"
+34,"NANTEN GPS",12CO,1-0,115271000000,Hz,m.s**-1,NANTEN,%,"NANTEN Galactic Plane Survey"
+35,OGS,12CO,1-0,115271000000,GHz,m.s**-1,EXFC,%12co%.fits,"Exeter-FCRAO Outer Galaxy Survey (Brunt et al. 2017)"
+36,OGS,13CO,1-0,110201000000,GHz,m.s**-1,EXFC,%13co%.fits,"Exeter-FCRAO Outer Galaxy Survey (Brunt et al. 2017)"
+37,COHRS,12CO,3-2,345796000000,Hz,km.s**-1,JCMT-HARP,%,"JCMT COHRS Survey (Dempsey et al. 2013, ApJS 209, 8)"
+38,VGPS,HI,"21 cm",1420405716,Hz,m.s**-1,HI_VGPS,%,"VLA HI Galactic Plane Survey (Stil et al. 2006, AJ 132, 1158)"
+39,CGPS,HI,"21 cm",1420406000,Hz,m.s**-1,HI_CGPS,%,"DRAO HI Canadian Galactic Plane Survey (Taylor et al. 1999, PASA 15, 56)"
+40,SGPS,HI,"21 cm",1420405718,Hz,m.s**-1,HI_SGPS,%,"ATCA HI Southern Galactic Plane Survey (McClure-Griffiths et al. 2001, ApJ 551, 394)"
+41,CORNISH,Continuum,"5 GHz",486166250000,Hz,,continuum/cornish,%,"VLA 5GHz northern Galactic Plane Survey (Hoare et al. 2012, PASP 124, 939)"
+42,"Hi-GAL Tiles",Continuum,"70 um",0,,,continuum/higal/blue,%_blue%,"Hi-GAL - Herschel infrared Galactic Plane Survey (Molinari et al. 2016, A&A 591, 149)"
+43,"Hi-GAL Tiles",Continuum,"160 um",0,,,continuum/higal/red,%_red%,"Hi-GAL - Herschel infrared Galactic Plane Survey (Molinari et al. 2016, A&A 591, 149)"
+44,"Hi-GAL Tiles",Continuum,"250 um",0,,,continuum/higal/PSW,%_PSW%,"Hi-GAL - Herschel infrared Galactic Plane Survey (Molinari et al. 2016, A&A 591, 149)"
+45,"Hi-GAL Tiles",Continuum,"350 um",0,,,continuum/higal/PMW,%_PMW%,"Hi-GAL - Herschel infrared Galactic Plane Survey (Molinari et al. 2016, A&A 591, 149)"
+46,"Hi-GAL Tiles",Continuum,"500 um",0,,,continuum/higal/PLW,%_PLW%,"Hi-GAL - Herschel infrared Galactic Plane Survey (Molinari et al. 2016, A&A 591, 149)"
+47,"Hi-GAL Mosaic",Continuum,"70 um",0,,,continuum/higal_mosaic,%_blue%,"Hi-GAL - Herschel infrared Galactic Plane Survey (Molinari et al. 2016, A&A 591, 149)"
+48,"Hi-GAL Mosaic",Continuum,"160 um",0,,,continuum/higal_mosaic,%_red%,"Hi-GAL - Herschel infrared Galactic Plane Survey (Molinari et al. 2016, A&A 591, 149)"
+49,"Hi-GAL Mosaic",Continuum,"250 um",0,,,continuum/higal_mosaic,%_PSW%,"Hi-GAL - Herschel infrared Galactic Plane Survey (Molinari et al. 2016, A&A 591, 149)"
+50,"Hi-GAL Mosaic",Continuum,"350 um",0,,,continuum/higal_mosaic,%_PMW%,"Hi-GAL - Herschel infrared Galactic Plane Survey (Molinari et al. 2016, A&A 591, 149)"
+51,"Hi-GAL Mosaic",Continuum,"500 um",0,,,continuum/higal_mosaic,%_PLW%,"Hi-GAL - Herschel infrared Galactic Plane Survey (Molinari et al. 2016, A&A 591, 149)"
+52,MIPSGAL,Continuum,"24 um",0,,,continuum/mipsgal,%,"Spitzer MIPS 24um Galactic Plane Survey (Carey et al. 2009, PASP 121, 76)"
+53,WISE,Continuum,"3.4 um",0,,,continuum/wise,%-w1-%,"WISE All-sky Survey (Wright et al. 2010, AJ 140, 1868)"
+54,WISE,Continuum,"4.6 um",0,,,continuum/wise,%-w2-%,"WISE All-sky Survey (Wright et al. 2010, AJ 140, 1868)"
+55,WISE,Continuum,"12 um",0,,,continuum/wise,%-w3-%,"WISE All-sky Survey (Wright et al. 2010, AJ 140, 1868)"
+56,WISE,Continuum,"22 um",0,,,continuum/wise,%-w4-%,"WISE All-sky Survey (Wright et al. 2010, AJ 140, 1868)"
+57,MAGPIS,Continuum,"20 cm",0,,,continuum/magpis,magpis.%.fits,"VLA 20cm Galactic Plane Survey (White, Becker & Helfand 2005, AJ 130, 586)"
+58,"ARO-FQS LowRes",12CO,1-0,115271000000,Hz,m.s**-1,ARO,map12co_fb2_%-cube.fits,"Arizona Radio Observatory 3rd Quadrant Survey (Benedettini et al. 2017)"
+59,"ARO-FQS HighRes",12CO,1-0,115271000000,Hz,m.s**-1,ARO,map12co_fb1_%-cube.fits,"Arizona Radio Observatory 3rd Quadrant Survey (Benedettini et al. 2017)"
+60,"ARO-FQS LowRes",13CO,1-0,110201000000,Hz,m.s**-1,ARO,map13co_fb2_%-cube.fits,"Arizona Radio Observatory 3rd Quadrant Survey (Benedettini et al. 2017)"
+61,"ARO-FQS HighRes",13CO,1-0,110201000000,Hz,m.s**-1,ARO,map13co_fb1_%-cube.fits,"Arizona Radio Observatory 3rd Quadrant Survey (Benedettini et al. 2017)"
+62,ATLASGAL,Continuum,"870 um",0,,,continuum/ATLASGAL,%.fits,"ESO-APEX Laboca Galactic Plane Survey (Schuller et al. 2009, A&A 504, 415)"
+63,"CSO BGPS",Continuum,"1.1 mm",0,,,continuum/BOLOCAMGPS,%.fits,"CSO Bolocam Galactic Plane Survey (Ginsburg et al. 2013, ApJS 208, 14)"
+64,"GLIMPSE 360",Continuum,"3.6 um",0,,,continuum/GLIMPSE/360/1.2_mosaics/corr,%I1.fits,"Spitzer GLIMPSE 360 Survey - Outer Galaxy Extension (Benjamin et al. 2003, PASP 115, 953)"
+65,"GLIMPSE 360",Continuum,"4.5 um",0,,,continuum/GLIMPSE/360/1.2_mosaics/corr,%I2.fits,"Spitzer GLIMPSE 360 Survey - Outer Galaxy Extension (Benjamin et al. 2003, PASP 115, 953)"
+66,"GLIMPSE 3D",Continuum,"3.6 um",0,,,continuum/GLIMPSE/3D/1.2_mosaics_v3.5,%I1.fits,"Spitzer GLIMPSE 3D Survey - Extension |b|<3¡ (Benjamin et al. 2003, PASP 115, 953)"
+67,"GLIMPSE 3D",Continuum,"4.5 um",0,,,continuum/GLIMPSE/3D/1.2_mosaics_v3.5,%I2.fits,"Spitzer GLIMPSE 3D Survey - Extension |b|<3¡ (Benjamin et al. 2003, PASP 115, 953)"
+68,"GLIMPSE 3D",Continuum,"5.8 um",0,,,continuum/GLIMPSE/3D/1.2_mosaics_v3.5,%I3.fits,"Spitzer GLIMPSE 3D Survey - Extension |b|<3¡ (Benjamin et al. 2003, PASP 115, 953)"
+69,"GLIMPSE 3D",Continuum,"8.0 um",0,,,continuum/GLIMPSE/3D/1.2_mosaics_v3.5,%I4.fits,"Spitzer GLIMPSE 3D Survey - Extension |b|<3¡ (Benjamin et al. 2003, PASP 115, 953)"
+70,"GLIMPSE I",Continuum,"3.6 um",0,,,continuum/GLIMPSE/I/1.2_mosaics_v3.5,%I1.fits,"Spitzer GLIMPSE Survey (Benjamin et al. 2003, PASP 115, 953)"
+71,"GLIMPSE I",Continuum,"4.5 um",0,,,continuum/GLIMPSE/I/1.2_mosaics_v3.5,%I2.fits,"Spitzer GLIMPSE Survey (Benjamin et al. 2003, PASP 115, 953)"
+72,"GLIMPSE I",Continuum,"5.8 um",0,,,continuum/GLIMPSE/I/1.2_mosaics_v3.5,%I3.fits,"Spitzer GLIMPSE Survey (Benjamin et al. 2003, PASP 115, 953)"
+73,"GLIMPSE I",Continuum,"8.0 um",0,,,continuum/GLIMPSE/I/1.2_mosaics_v3.5,%I4.fits,"Spitzer GLIMPSE Survey (Benjamin et al. 2003, PASP 115, 953)"
+74,"GLIMPSE II",Continuum,"3.6 um",0,,,continuum/GLIMPSE/II/1.2_mosaics_v3.5,%I1.fits,"Spitzer GLIMPSE II Survey - Extension |l|<10¡ (Benjamin et al. 2003, PASP 115, 953)"
+75,"GLIMPSE II",Continuum,"4.5 um",0,,,continuum/GLIMPSE/II/1.2_mosaics_v3.5,%I2.fits,"Spitzer GLIMPSE II Survey - Extension |l|<10¡ (Benjamin et al. 2003, PASP 115, 953)"
+76,"GLIMPSE II",Continuum,"5.8 um",0,,,continuum/GLIMPSE/II/1.2_mosaics_v3.5,%I3.fits,"Spitzer GLIMPSE II Survey - Extension |l|<10¡ (Benjamin et al. 2003, PASP 115, 953)"
+77,"GLIMPSE II",Continuum,"8.0 um",0,,,continuum/GLIMPSE/II/1.2_mosaics_v3.5,%I4.fits,"Spitzer GLIMPSE II Survey - Extension |l|<10¡ (Benjamin et al. 2003, PASP 115, 953)"
+78,THOR,"Continuum - 25\" res.","1.06 GHz",1060000000,Hz,m.s**-1,THOR/continuum/1060mhz,%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)"
+79,THOR,"Continuum - 25\" res.","1.31 GHz",1310000000,Hz,m.s**-1,THOR/continuum/1310mhz,%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)"
+80,THOR,"Continuum - 25\" res.","1.44 GHz",1440000000,Hz,m.s**-1,THOR/continuum/1440mhz,%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)"
+81,THOR,"Continuum - 25\" res.","1.69 GHz",1690000000,Hz,m.s**-1,THOR/continuum/1690mhz,%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)"
+82,THOR,"Continuum - 25\" res.","1.82 GHz",1820000000,Hz,m.s**-1,THOR/continuum/1820mhz,%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)"
+83,THOR,"Continuum - 25\" res.","1.95 GHz",1950000000,Hz,m.s**-1,THOR/continuum/1950mhz,%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)"
+84,THOR,"Continuum - Native Res.","1.06 GHz",1060000000,Hz,m.s**-1,THOR/continuum/fullres,%1060MHz%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)"
+85,THOR,"Continuum - Native Res.","1.31 GHz",1310000000,Hz,m.s**-1,THOR/continuum/fullres,%1310MHz%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)"
+86,THOR,"Continuum - Native Res.","1.44 GHz",1440000000,Hz,m.s**-1,THOR/continuum/fullres,%1440MHz%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)"
+87,THOR,"Continuum - Native Res.","1.69 GHz",1690000000,Hz,m.s**-1,THOR/continuum/fullres,%1690MHz%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)"
+88,THOR,"Continuum - Native Res.","1.82 GHz",1820000000,Hz,m.s**-1,THOR/continuum/fullres,%1820MHz%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)"
+89,THOR,"Continuum - Native Res.","1.95 GHz",1950000000,Hz,m.s**-1,THOR/continuum/fullres,%1950MHz%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)"
+90,THOR,HI,"21 cm",1420405752,Hz,m.s**-1,THOR/HI,%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)"
+91,THOR,OH,"1.612 GHz",1612553669,Hz,m.s**-1,THOR/OH,%1612mhz%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)"
+92,THOR,OH,"1.665-1.667 GHz",1665000000,Hz,m.s**-1,THOR/OH,%166%mhz%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)"
+93,THOR,OH,"1.720 GHz",1720000000,Hz,m.s**-1,THOR/OH,%1720mhz%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)"
+94,SEDIGISM,13CO,2-1,220398680000,Hz,m.s**-1,SEDIGISM/13CO,%_13CO21_%.fits,"SEDIGISM (Schuller et al. 2017, A&A 601, 124)"
+95,SEDIGISM,C18O,2-1,219560360000,Hz,m.s**-1,SEDIGISM/C18O,%_C18O21_%.fits,"SEDIGISM (Schuller et al. 2017, A&A 601, 124)"
+96,FUGIN,12CO,1-0,115271203000,Hz,m.s**-1,FUGIN,%_12CO_%_cube.fits,"Nobeyama-45m Galactic Plane Survey (Umemoto et al. 2017, PASJ 69, 78)"
+97,FUGIN,13CO,1-0,110201370000,Hz,m.s**-1,FUGIN,%_13CO_%_cube.fits,"Nobeyama-45m Galactic Plane Survey (Umemoto et al. 2017, PASJ 69, 78)"
+98,FUGIN,C18O,1-0,109782182000,Hz,m.s**-1,FUGIN,%_C18O_%_cube.fits,"Nobeyama-45m Galactic Plane Survey (Umemoto et al. 2017, PASJ 69, 78)"
+99,"NANTEN2 GPS",12CO,1-0,115271000000,Hz,m.s**-1,NANTEN2,%,"NANTEN2 Galactic Plane Survey"
diff --git a/data-access/engine/resources/Obsolete/survey_populate.csv.obscore b/data-access/engine/resources/Obsolete/survey_populate.csv.obscore
new file mode 100644
index 0000000000000000000000000000000000000000..cdac1c7e9fca2371f7aa3d38afac38e806e89e82
--- /dev/null
+++ b/data-access/engine/resources/Obsolete/survey_populate.csv.obscore
@@ -0,0 +1,100 @@
+survey_id,name,species,transition,rest_frequency,restf_fits_unit,velocity_fits_unit,storage_path,file_filter,description,dataproduct_type,calib_level,o_ucd,fitskey_facility_name,fitskey_instrument_name
+1,ExtMaps,distance,rm10,0,Hz,kpc,extinction_maps,%_rm10.fits,"Galactic Plane Extinction maps (Arab & Cambresy 2017) - 10 arcmin resolution using 2MASS","cube",2,"o-ucd","TELESCOP","INSTRUME"
+2,ExtMaps,distance,rm5,0,Hz,kpc,extinction_maps,%_rm5.fits,"Galactic Plane Extinction maps (Arab & Cambresy 2017) - 5 arcmin resolution using 2MASS",   "cube",2,"o-ucd","TELESCOP","INSTRUME"
+3,"Mopra GPS",12CO,1-0,115271000000,Hz,m.s**-1,MOPRA/12CO,%.fits,"Mopra Galactic Plane Survey (Burton et al. 2017)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+4,"Mopra GPS",13CO,1-0,110201000000,Hz,m.s**-1,MOPRA/13CO,%.fits,"Mopra Galactic Plane Survey (Burton et al. 2017)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+5,"Mopra GPS",C17O,1-0,112359000000,Hz,m.s**-1,MOPRA/C17O,%.fits,"Mopra Galactic Plane Survey (Burton et al. 2017)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+6,"Mopra GPS",C18O,1-0,109782000000,Hz,m.s**-1,MOPRA/C18O,%.fits,"Mopra Galactic Plane Survey (Burton et al. 2017)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+7,CHIMPS,C18O,3-2,329331000000,Hz,km.s**-1,CHIMPS,CHIMPS_C18O_%.fits,"JCMT CHIMPS Survey (Rigby et al. 2016, MNRAS 456, 2885)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+8,CHIMPS,13CO,3-2,330588000000,Hz,km.s**-1,CHIMPS,CHIMPS_13CO_%.fits,"JCMT CHIMPS Survey (Rigby et al. 2016, MNRAS 456, 2885)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+9,CHaMP,HCO+,1-0,89188520000,Hz,m.s**-1,CHaMP,%,"Mopra CHaMP Survey (Barnes et al. 2011, ApJS 196, 12)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+10,HOPS,H2O,6-1-6_5-2-3,22235080000,Hz,m.s**-1,HOPS,G%-H2O-cube.fits,"Mopra HOPS Survey (Walsh et al. 2011, MNRAS 416, 1764)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+11,HOPS,NH3,1-1_1-1,23694495487,Hz,m.s**-1,HOPS,G%-NH3-11-cube.fits,"Mopra HOPS Survey (Walsh et al. 2011, MNRAS 416, 1764)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+12,HOPS,NH3,2-2_2-2,23722633335,Hz,m.s**-1,HOPS,G%-NH3-22-cube.fits,"Mopra HOPS Survey (Walsh et al. 2011, MNRAS 416, 1764)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+13,GRS,13CO,1-0,110201000000,GHz,m.s**-1,FCRAO_13CO_1_0_GRS,%,"BU-FCRAO Galactic Ring Survey (Jackson et al. 2011, ApJS 163, 145)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+14,MALT90,N2H+,1-0,93173773438,Hz,m.s**-1,MALT90/n2hp,G%_n2hp_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+15,MALT90,13CS,2-1,92494303000,Hz,m.s**-1,MALT90/13cs,G%_13cs_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+16,MALT90,H,41a,92034475000,Hz,m.s**-1,MALT90/h41a,G%_h41a_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+17,MALT90,CH3CN,51-41,91985316000,Hz,m.s**-1,MALT90/ch3cn,G%_ch3cn_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+18,MALT90,HC3N,10-9,91199796000,Hz,m.s**-1,MALT90/hc3n,G%_hc3n_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+19,MALT90,13C34S,2-1,90926036000,Hz,m.s**-1,MALT90/13c34s,G%_13c34s_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+20,MALT90,HNC,1-0,90663570313,Hz,m.s**-1,MALT90/hnc,G%_hnc_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+21,MALT90,HC13CCN,10-9_9-8,90593059000,Hz,m.s**-1,MALT90/hc13ccn,G%_hc13ccn_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+22,MALT90,HCO+,1-0,89188523438,Hz,m.s**-1,MALT90/hcop,G%_hcop_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+23,MALT90,HCN,1-0,88631843750,Hz,m.s**-1,MALT90/hcn,G%_hcn_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+24,MALT90,HNCO,413-312,88239027000,Hz,m.s**-1,MALT90/hnco413,G%_hnco413_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+25,MALT90,HNCO,404-303,87925238888,Hz,m.s**-1,MALT90/hnco404,G%_hnco404_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+26,MALT90,C2H,1-0_3/2-1/2_2-1,87316925000,Hz,m.s**-1,MALT90/c2h,G%_c2h_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+27,MALT90,HN13C,1-0,87090850000,Hz,m.s**-1,MALT90/hn13c,G%_hn13c_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+28,MALT90,SiO,2-1,86847010000,Hz,m.s**-1,MALT90/sio,G%_sio_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+29,MALT90,H13CO+,1-0,86754330000,Hz,m.s**-1,MALT90/h13cop,G%_h13cop_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+30,ThrUMMS,12CO,1-0,115271000000,Hz,m.s**-1,ThrUMMS,%.12co.fits,"Mopra ThrUMMS Survey (Barnes et al. 2015, ApJ 812, 6)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+31,ThrUMMS,13CO,1-0,110201000000,Hz,m.s**-1,ThrUMMS,%.13co.fits,"Mopra ThrUMMS Survey (Barnes et al. 2015, ApJ 812, 6)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+32,ThrUMMS,C18O,1-0,109782000000,Hz,m.s**-1,ThrUMMS,%.c18o.fits,"Mopra ThrUMMS Survey (Barnes et al. 2015, ApJ 812, 6)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+33,ThrUMMS,CN,1-2-3_0-1-2,113491000000,Hz,m.s**-1,ThrUMMS,%.cn.fits,"Mopra ThrUMMS Survey (Barnes et al. 2015, ApJ 812, 6)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+34,"NANTEN GPS",12CO,1-0,115271000000,Hz,m.s**-1,NANTEN,%,"NANTEN Galactic Plane Survey","cube",2,"o-ucd","TELESCOP","INSTRUME"
+35,OGS,12CO,1-0,115271000000,GHz,m.s**-1,EXFC,%12co%.fits,"Exeter-FCRAO Outer Galaxy Survey (Brunt et al. 2017)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+36,OGS,13CO,1-0,110201000000,GHz,m.s**-1,EXFC,%13co%.fits,"Exeter-FCRAO Outer Galaxy Survey (Brunt et al. 2017)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+37,COHRS,12CO,3-2,345796000000,Hz,km.s**-1,JCMT-HARP,%,"JCMT COHRS Survey (Dempsey et al. 2013, ApJS 209, 8)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+38,VGPS,HI,"21 cm",1420405716,Hz,m.s**-1,HI_VGPS,%,"VLA HI Galactic Plane Survey (Stil et al. 2006, AJ 132, 1158)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+39,CGPS,HI,"21 cm",1420406000,Hz,m.s**-1,HI_CGPS,%,"DRAO HI Canadian Galactic Plane Survey (Taylor et al. 1999, PASA 15, 56)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+40,SGPS,HI,"21 cm",1420405718,Hz,m.s**-1,HI_SGPS,%,"ATCA HI Southern Galactic Plane Survey (McClure-Griffiths et al. 2001, ApJ 551, 394)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+41,CORNISH,Continuum,"5 GHz",486166250000,Hz,,continuum/cornish,%,"VLA 5GHz northern Galactic Plane Survey (Hoare et al. 2012, PASP 124, 939)"                ,"image",2,"o-ucd","TELESCOP","INSTRUME"
+42,"Hi-GAL Tiles",Continuum,"70 um",0,,,continuum/higal/blue,%_blue%,"Hi-GAL - Herschel infrared Galactic Plane Survey (Molinari et al. 2016, A&A 591, 149)"  ,"image",2,"o-ucd","TELESCOP","INSTRUME"
+43,"Hi-GAL Tiles",Continuum,"160 um",0,,,continuum/higal/red,%_red%,"Hi-GAL - Herschel infrared Galactic Plane Survey (Molinari et al. 2016, A&A 591, 149)"  , "image",2,"o-ucd","TELESCOP","INSTRUME"
+44,"Hi-GAL Tiles",Continuum,"250 um",0,,,continuum/higal/PSW,%_PSW%,"Hi-GAL - Herschel infrared Galactic Plane Survey (Molinari et al. 2016, A&A 591, 149)"  , "image",2,"o-ucd","TELESCOP","INSTRUME"
+45,"Hi-GAL Tiles",Continuum,"350 um",0,,,continuum/higal/PMW,%_PMW%,"Hi-GAL - Herschel infrared Galactic Plane Survey (Molinari et al. 2016, A&A 591, 149)"  , "image",2,"o-ucd","TELESCOP","INSTRUME"
+46,"Hi-GAL Tiles",Continuum,"500 um",0,,,continuum/higal/PLW,%_PLW%,"Hi-GAL - Herschel infrared Galactic Plane Survey (Molinari et al. 2016, A&A 591, 149)"  , "image",2,"o-ucd","TELESCOP","INSTRUME"
+47,"Hi-GAL Mosaic",Continuum,"70 um",0,,,continuum/higal_mosaic,%_blue%,"Hi-GAL - Herschel infrared Galactic Plane Survey (Molinari et al. 2016, A&A 591, 149)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+48,"Hi-GAL Mosaic",Continuum,"160 um",0,,,continuum/higal_mosaic,%_red%,"Hi-GAL - Herschel infrared Galactic Plane Survey (Molinari et al. 2016, A&A 591, 149)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+49,"Hi-GAL Mosaic",Continuum,"250 um",0,,,continuum/higal_mosaic,%_PSW%,"Hi-GAL - Herschel infrared Galactic Plane Survey (Molinari et al. 2016, A&A 591, 149)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+50,"Hi-GAL Mosaic",Continuum,"350 um",0,,,continuum/higal_mosaic,%_PMW%,"Hi-GAL - Herschel infrared Galactic Plane Survey (Molinari et al. 2016, A&A 591, 149)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+51,"Hi-GAL Mosaic",Continuum,"500 um",0,,,continuum/higal_mosaic,%_PLW%,"Hi-GAL - Herschel infrared Galactic Plane Survey (Molinari et al. 2016, A&A 591, 149)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+52,MIPSGAL,Continuum,"24 um",0,,,continuum/mipsgal,%,"Spitzer MIPS 24um Galactic Plane Survey (Carey et al. 2009, PASP 121, 76)"                              , "image",2,"o-ucd","TELESCOP","INSTRUME"
+53,WISE,Continuum,"3.4 um",0,,,continuum/wise,%-w1-%,"WISE All-sky Survey (Wright et al. 2010, AJ 140, 1868)"                                                 , "image",2,"o-ucd","TELESCOP","INSTRUME"
+54,WISE,Continuum,"4.6 um",0,,,continuum/wise,%-w2-%,"WISE All-sky Survey (Wright et al. 2010, AJ 140, 1868)"                                                 , "image",2,"o-ucd","TELESCOP","INSTRUME"
+55,WISE,Continuum,"12 um",0,,,continuum/wise,%-w3-%,"WISE All-sky Survey (Wright et al. 2010, AJ 140, 1868)"                                                  , "image",2,"o-ucd","TELESCOP","INSTRUME"
+56,WISE,Continuum,"22 um",0,,,continuum/wise,%-w4-%,"WISE All-sky Survey (Wright et al. 2010, AJ 140, 1868)"                                                  , "image",2,"o-ucd","TELESCOP","INSTRUME"
+57,MAGPIS,Continuum,"20 cm",0,,,continuum/magpis,magpis.%.fits,"VLA 20cm Galactic Plane Survey (White, Becker & Helfand 2005, AJ 130, 586)"                    , "image",2,"o-ucd","TELESCOP","INSTRUME"
+58,"ARO-FQS LowRes",12CO,1-0,115271000000,Hz,m.s**-1,ARO,map12co_fb2_%-cube.fits,"Arizona Radio Observatory 3rd Quadrant Survey (Benedettini et al. 2017)"     , "image",2,"o-ucd","TELESCOP","INSTRUME"
+59,"ARO-FQS HighRes",12CO,1-0,115271000000,Hz,m.s**-1,ARO,map12co_fb1_%-cube.fits,"Arizona Radio Observatory 3rd Quadrant Survey (Benedettini et al. 2017)"    , "image",2,"o-ucd","TELESCOP","INSTRUME"
+60,"ARO-FQS LowRes",13CO,1-0,110201000000,Hz,m.s**-1,ARO,map13co_fb2_%-cube.fits,"Arizona Radio Observatory 3rd Quadrant Survey (Benedettini et al. 2017)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+61,"ARO-FQS HighRes",13CO,1-0,110201000000,Hz,m.s**-1,ARO,map13co_fb1_%-cube.fits,"Arizona Radio Observatory 3rd Quadrant Survey (Benedettini et al. 2017)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+62,ATLASGAL,Continuum,"870 um",0,,,continuum/ATLASGAL,%.fits,"ESO-APEX Laboca Galactic Plane Survey (Schuller et al. 2009, A&A 504, 415)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+63,"CSO BGPS",Continuum,"1.1 mm",0,,,continuum/BOLOCAMGPS,%.fits,"CSO Bolocam Galactic Plane Survey (Ginsburg et al. 2013, ApJS 208, 14)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+64,"GLIMPSE 360",Continuum,"3.6 um",0,,,continuum/GLIMPSE/360/1.2_mosaics/corr,%I1.fits,"Spitzer GLIMPSE 360 Survey - Outer Galaxy Extension (Benjamin et al. 2003, PASP 115, 953)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+65,"GLIMPSE 360",Continuum,"4.5 um",0,,,continuum/GLIMPSE/360/1.2_mosaics/corr,%I2.fits,"Spitzer GLIMPSE 360 Survey - Outer Galaxy Extension (Benjamin et al. 2003, PASP 115, 953)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+66,"GLIMPSE 3D",Continuum,"3.6 um",0,,,continuum/GLIMPSE/3D/1.2_mosaics_v3.5,%I1.fits,"Spitzer GLIMPSE 3D Survey - Extension |b|<3¡ (Benjamin et al. 2003, PASP 115, 953)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+67,"GLIMPSE 3D",Continuum,"4.5 um",0,,,continuum/GLIMPSE/3D/1.2_mosaics_v3.5,%I2.fits,"Spitzer GLIMPSE 3D Survey - Extension |b|<3¡ (Benjamin et al. 2003, PASP 115, 953)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+68,"GLIMPSE 3D",Continuum,"5.8 um",0,,,continuum/GLIMPSE/3D/1.2_mosaics_v3.5,%I3.fits,"Spitzer GLIMPSE 3D Survey - Extension |b|<3¡ (Benjamin et al. 2003, PASP 115, 953)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+69,"GLIMPSE 3D",Continuum,"8.0 um",0,,,continuum/GLIMPSE/3D/1.2_mosaics_v3.5,%I4.fits,"Spitzer GLIMPSE 3D Survey - Extension |b|<3¡ (Benjamin et al. 2003, PASP 115, 953)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+70,"GLIMPSE I",Continuum,"3.6 um",0,,,continuum/GLIMPSE/I/1.2_mosaics_v3.5,%I1.fits,"Spitzer GLIMPSE Survey (Benjamin et al. 2003, PASP 115, 953)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+71,"GLIMPSE I",Continuum,"4.5 um",0,,,continuum/GLIMPSE/I/1.2_mosaics_v3.5,%I2.fits,"Spitzer GLIMPSE Survey (Benjamin et al. 2003, PASP 115, 953)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+72,"GLIMPSE I",Continuum,"5.8 um",0,,,continuum/GLIMPSE/I/1.2_mosaics_v3.5,%I3.fits,"Spitzer GLIMPSE Survey (Benjamin et al. 2003, PASP 115, 953)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+73,"GLIMPSE I",Continuum,"8.0 um",0,,,continuum/GLIMPSE/I/1.2_mosaics_v3.5,%I4.fits,"Spitzer GLIMPSE Survey (Benjamin et al. 2003, PASP 115, 953)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+74,"GLIMPSE II",Continuum,"3.6 um",0,,,continuum/GLIMPSE/II/1.2_mosaics_v3.5,%I1.fits,"Spitzer GLIMPSE II Survey - Extension |l|<10¡ (Benjamin et al. 2003, PASP 115, 953)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+75,"GLIMPSE II",Continuum,"4.5 um",0,,,continuum/GLIMPSE/II/1.2_mosaics_v3.5,%I2.fits,"Spitzer GLIMPSE II Survey - Extension |l|<10¡ (Benjamin et al. 2003, PASP 115, 953)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+76,"GLIMPSE II",Continuum,"5.8 um",0,,,continuum/GLIMPSE/II/1.2_mosaics_v3.5,%I3.fits,"Spitzer GLIMPSE II Survey - Extension |l|<10¡ (Benjamin et al. 2003, PASP 115, 953)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+77,"GLIMPSE II",Continuum,"8.0 um",0,,,continuum/GLIMPSE/II/1.2_mosaics_v3.5,%I4.fits,"Spitzer GLIMPSE II Survey - Extension |l|<10¡ (Benjamin et al. 2003, PASP 115, 953)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+78,THOR,"Continuum - 25\" res.","1.06 GHz",1060000000,Hz,m.s**-1,THOR/continuum/1060mhz,%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+79,THOR,"Continuum - 25\" res.","1.31 GHz",1310000000,Hz,m.s**-1,THOR/continuum/1310mhz,%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+80,THOR,"Continuum - 25\" res.","1.44 GHz",1440000000,Hz,m.s**-1,THOR/continuum/1440mhz,%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+81,THOR,"Continuum - 25\" res.","1.69 GHz",1690000000,Hz,m.s**-1,THOR/continuum/1690mhz,%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+82,THOR,"Continuum - 25\" res.","1.82 GHz",1820000000,Hz,m.s**-1,THOR/continuum/1820mhz,%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+83,THOR,"Continuum - 25\" res.","1.95 GHz",1950000000,Hz,m.s**-1,THOR/continuum/1950mhz,%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+84,THOR,"Continuum - Native Res.","1.06 GHz",1060000000,Hz,m.s**-1,THOR/continuum/fullres,%1060MHz%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+85,THOR,"Continuum - Native Res.","1.31 GHz",1310000000,Hz,m.s**-1,THOR/continuum/fullres,%1310MHz%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+86,THOR,"Continuum - Native Res.","1.44 GHz",1440000000,Hz,m.s**-1,THOR/continuum/fullres,%1440MHz%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+87,THOR,"Continuum - Native Res.","1.69 GHz",1690000000,Hz,m.s**-1,THOR/continuum/fullres,%1690MHz%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+88,THOR,"Continuum - Native Res.","1.82 GHz",1820000000,Hz,m.s**-1,THOR/continuum/fullres,%1820MHz%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+89,THOR,"Continuum - Native Res.","1.95 GHz",1950000000,Hz,m.s**-1,THOR/continuum/fullres,%1950MHz%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)", "image",2,"o-ucd","TELESCOP","INSTRUME"
+90,THOR,HI,"21 cm",1420405752,Hz,m.s**-1,THOR/HI,%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+91,THOR,OH,"1.612 GHz",1612553669,Hz,m.s**-1,THOR/OH,%1612mhz%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+92,THOR,OH,"1.665-1.667 GHz",1665000000,Hz,m.s**-1,THOR/OH,%166%mhz%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+93,THOR,OH,"1.720 GHz",1720000000,Hz,m.s**-1,THOR/OH,%1720mhz%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+94,SEDIGISM,13CO,2-1,220398680000,Hz,m.s**-1,SEDIGISM/13CO,%_13CO21_%.fits,"SEDIGISM (Schuller et al. 2017, A&A 601, 124)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+95,SEDIGISM,C18O,2-1,219560360000,Hz,m.s**-1,SEDIGISM/C18O,%_C18O21_%.fits,"SEDIGISM (Schuller et al. 2017, A&A 601, 124)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+96,FUGIN,12CO,1-0,115271203000,Hz,m.s**-1,FUGIN,%_12CO_%_cube.fits,"Nobeyama-45m Galactic Plane Survey (Umemoto et al. 2017, PASJ 69, 78)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+97,FUGIN,13CO,1-0,110201370000,Hz,m.s**-1,FUGIN,%_13CO_%_cube.fits,"Nobeyama-45m Galactic Plane Survey (Umemoto et al. 2017, PASJ 69, 78)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+98,FUGIN,C18O,1-0,109782182000,Hz,m.s**-1,FUGIN,%_C18O_%_cube.fits,"Nobeyama-45m Galactic Plane Survey (Umemoto et al. 2017, PASJ 69, 78)","cube",2,"o-ucd","TELESCOP","INSTRUME"
+99,"NANTEN2 GPS",12CO,1-0,115271000000,Hz,m.s**-1,NANTEN2,%,"NANTEN2 Galactic Plane Survey","cube",2,"o-ucd","TELESCOP","INSTRUME"
diff --git a/data-access/engine/resources/survey_populate.csv b/data-access/engine/resources/survey_populate.csv
new file mode 100644
index 0000000000000000000000000000000000000000..e5bfad1eb2b119a48edd0406b0352c75977027a4
--- /dev/null
+++ b/data-access/engine/resources/survey_populate.csv
@@ -0,0 +1,100 @@
+survey_id,name,species,transition,rest_frequency,restf_fits_unit,velocity_fits_unit,storage_path,file_filter,description,dataproduct_type,calib_level,o_ucd,fitskey_facility_name,fitskey_instrument_name,auth_policy
+1,ExtMaps,distance,rm10,0,Hz,kpc,extinction_maps,%_rm10.fits,"Galactic Plane Extinction maps (Arab & Cambresy 2017) - 10 arcmin resolution using 2MASS","cube",2,"o-ucd","TELESCOP","INSTRUME",PRIVATE
+2,ExtMaps,distance,rm5,0,Hz,kpc,extinction_maps,%_rm5.fits,"Galactic Plane Extinction maps (Arab & Cambresy 2017) - 5 arcmin resolution using 2MASS",   "cube",2,"o-ucd","TELESCOP","INSTRUME",PRIVATE
+3,"Mopra GPS",12CO,1-0,115271000000,Hz,m.s**-1,MOPRA/12CO,%.fits,"Mopra Galactic Plane Survey (Burton et al. 2017)","cube",2,"o-ucd","TELESCOP","INSTRUME",PRIVATE
+4,"Mopra GPS",13CO,1-0,110201000000,Hz,m.s**-1,MOPRA/13CO,%.fits,"Mopra Galactic Plane Survey (Burton et al. 2017)","cube",2,"o-ucd","TELESCOP","INSTRUME",PRIVATE
+5,"Mopra GPS",C17O,1-0,112359000000,Hz,m.s**-1,MOPRA/C17O,%.fits,"Mopra Galactic Plane Survey (Burton et al. 2017)","cube",2,"o-ucd","TELESCOP","INSTRUME",PRIVATE
+6,"Mopra GPS",C18O,1-0,109782000000,Hz,m.s**-1,MOPRA/C18O,%.fits,"Mopra Galactic Plane Survey (Burton et al. 2017)","cube",2,"o-ucd","TELESCOP","INSTRUME",PRIVATE
+7,CHIMPS,C18O,3-2,329331000000,Hz,km.s**-1,CHIMPS,CHIMPS_C18O_%.fits,"JCMT CHIMPS Survey (Rigby et al. 2016, MNRAS 456, 2885)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+8,CHIMPS,13CO,3-2,330588000000,Hz,km.s**-1,CHIMPS,CHIMPS_13CO_%.fits,"JCMT CHIMPS Survey (Rigby et al. 2016, MNRAS 456, 2885)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+9,CHaMP,HCO+,1-0,89188520000,Hz,m.s**-1,CHaMP,%,"Mopra CHaMP Survey (Barnes et al. 2011, ApJS 196, 12)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+10,HOPS,H2O,6-1-6_5-2-3,22235080000,Hz,m.s**-1,HOPS,G%-H2O-cube.fits,"Mopra HOPS Survey (Walsh et al. 2011, MNRAS 416, 1764)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+11,HOPS,NH3,1-1_1-1,23694495487,Hz,m.s**-1,HOPS,G%-NH3-11-cube.fits,"Mopra HOPS Survey (Walsh et al. 2011, MNRAS 416, 1764)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+12,HOPS,NH3,2-2_2-2,23722633335,Hz,m.s**-1,HOPS,G%-NH3-22-cube.fits,"Mopra HOPS Survey (Walsh et al. 2011, MNRAS 416, 1764)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+13,GRS,13CO,1-0,110201000000,GHz,m.s**-1,FCRAO_13CO_1_0_GRS,%,"BU-FCRAO Galactic Ring Survey (Jackson et al. 2011, ApJS 163, 145)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+14,MALT90,N2H+,1-0,93173773438,Hz,m.s**-1,MALT90/n2hp,G%_n2hp_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+15,MALT90,13CS,2-1,92494303000,Hz,m.s**-1,MALT90/13cs,G%_13cs_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+16,MALT90,H,41a,92034475000,Hz,m.s**-1,MALT90/h41a,G%_h41a_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+17,MALT90,CH3CN,51-41,91985316000,Hz,m.s**-1,MALT90/ch3cn,G%_ch3cn_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+18,MALT90,HC3N,10-9,91199796000,Hz,m.s**-1,MALT90/hc3n,G%_hc3n_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+19,MALT90,13C34S,2-1,90926036000,Hz,m.s**-1,MALT90/13c34s,G%_13c34s_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+20,MALT90,HNC,1-0,90663570313,Hz,m.s**-1,MALT90/hnc,G%_hnc_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+21,MALT90,HC13CCN,10-9_9-8,90593059000,Hz,m.s**-1,MALT90/hc13ccn,G%_hc13ccn_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+22,MALT90,HCO+,1-0,89188523438,Hz,m.s**-1,MALT90/hcop,G%_hcop_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+23,MALT90,HCN,1-0,88631843750,Hz,m.s**-1,MALT90/hcn,G%_hcn_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+24,MALT90,HNCO,413-312,88239027000,Hz,m.s**-1,MALT90/hnco413,G%_hnco413_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+25,MALT90,HNCO,404-303,87925238888,Hz,m.s**-1,MALT90/hnco404,G%_hnco404_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+26,MALT90,C2H,1-0_3/2-1/2_2-1,87316925000,Hz,m.s**-1,MALT90/c2h,G%_c2h_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+27,MALT90,HN13C,1-0,87090850000,Hz,m.s**-1,MALT90/hn13c,G%_hn13c_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+28,MALT90,SiO,2-1,86847010000,Hz,m.s**-1,MALT90/sio,G%_sio_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+29,MALT90,H13CO+,1-0,86754330000,Hz,m.s**-1,MALT90/h13cop,G%_h13cop_%.fits,"Mopra MALT90 Survey (Jackson et al. 2013, PASA 30, 57)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+30,ThrUMMS,12CO,1-0,115271000000,Hz,m.s**-1,ThrUMMS,%.12co.fits,"Mopra ThrUMMS Survey (Barnes et al. 2015, ApJ 812, 6)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+31,ThrUMMS,13CO,1-0,110201000000,Hz,m.s**-1,ThrUMMS,%.13co.fits,"Mopra ThrUMMS Survey (Barnes et al. 2015, ApJ 812, 6)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+32,ThrUMMS,C18O,1-0,109782000000,Hz,m.s**-1,ThrUMMS,%.c18o.fits,"Mopra ThrUMMS Survey (Barnes et al. 2015, ApJ 812, 6)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+33,ThrUMMS,CN,1-2-3_0-1-2,113491000000,Hz,m.s**-1,ThrUMMS,%.cn.fits,"Mopra ThrUMMS Survey (Barnes et al. 2015, ApJ 812, 6)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+34,"NANTEN GPS",12CO,1-0,115271000000,Hz,m.s**-1,NANTEN,%.fits,"NANTEN Galactic Plane Survey","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+35,OGS,12CO,1-0,115271000000,GHz,m.s**-1,EXFC,%12co%.fits,"Exeter-FCRAO Outer Galaxy Survey (Brunt et al. 2017)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+36,OGS,13CO,1-0,110201000000,GHz,m.s**-1,EXFC,%13co%.fits,"Exeter-FCRAO Outer Galaxy Survey (Brunt et al. 2017)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+37,COHRS,12CO,3-2,345796000000,Hz,km.s**-1,JCMT-HARP,%,"JCMT COHRS Survey (Dempsey et al. 2013, ApJS 209, 8)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+38,VGPS,HI,"21 cm",1420405716,Hz,m.s**-1,HI_VGPS,%,"VLA HI Galactic Plane Survey (Stil et al. 2006, AJ 132, 1158)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+39,CGPS,HI,"21 cm",1420406000,Hz,m.s**-1,HI_CGPS,%,"DRAO HI Canadian Galactic Plane Survey (Taylor et al. 1999, PASA 15, 56)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+40,SGPS,HI,"21 cm",1420405718,Hz,m.s**-1,HI_SGPS,%,"ATCA HI Southern Galactic Plane Survey (McClure-Griffiths et al. 2001, ApJ 551, 394)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+41,CORNISH,Continuum,"5 GHz",486166250000,Hz,,continuum/cornish,%.fits,"VLA 5GHz northern Galactic Plane Survey (Hoare et al. 2012, PASP 124, 939)"                ,"image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+42,"Hi-GAL Tiles",Continuum,"70 um",0,,,continuum/higal/blue,%_blue%,"Hi-GAL - Herschel infrared Galactic Plane Survey (Molinari et al. 2016, A&A 591, 149)"  ,"image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+43,"Hi-GAL Tiles",Continuum,"160 um",0,,,continuum/higal/red,%_red%,"Hi-GAL - Herschel infrared Galactic Plane Survey (Molinari et al. 2016, A&A 591, 149)"  , "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+44,"Hi-GAL Tiles",Continuum,"250 um",0,,,continuum/higal/PSW,%_PSW%,"Hi-GAL - Herschel infrared Galactic Plane Survey (Molinari et al. 2016, A&A 591, 149)"  , "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+45,"Hi-GAL Tiles",Continuum,"350 um",0,,,continuum/higal/PMW,%_PMW%,"Hi-GAL - Herschel infrared Galactic Plane Survey (Molinari et al. 2016, A&A 591, 149)"  , "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+46,"Hi-GAL Tiles",Continuum,"500 um",0,,,continuum/higal/PLW,%_PLW%,"Hi-GAL - Herschel infrared Galactic Plane Survey (Molinari et al. 2016, A&A 591, 149)"  , "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+47,"Hi-GAL Mosaic",Continuum,"70 um",0,,,continuum/higal_mosaic,%_blue%,"Hi-GAL - Herschel infrared Galactic Plane Survey (Molinari et al. 2016, A&A 591, 149)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+48,"Hi-GAL Mosaic",Continuum,"160 um",0,,,continuum/higal_mosaic,%_red%,"Hi-GAL - Herschel infrared Galactic Plane Survey (Molinari et al. 2016, A&A 591, 149)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+49,"Hi-GAL Mosaic",Continuum,"250 um",0,,,continuum/higal_mosaic,%_PSW%,"Hi-GAL - Herschel infrared Galactic Plane Survey (Molinari et al. 2016, A&A 591, 149)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+50,"Hi-GAL Mosaic",Continuum,"350 um",0,,,continuum/higal_mosaic,%_PMW%,"Hi-GAL - Herschel infrared Galactic Plane Survey (Molinari et al. 2016, A&A 591, 149)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+51,"Hi-GAL Mosaic",Continuum,"500 um",0,,,continuum/higal_mosaic,%_PLW%,"Hi-GAL - Herschel infrared Galactic Plane Survey (Molinari et al. 2016, A&A 591, 149)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+52,MIPSGAL,Continuum,"24 um",0,,,continuum/mipsgal,%,"Spitzer MIPS 24um Galactic Plane Survey (Carey et al. 2009, PASP 121, 76)"                              , "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+53,WISE,Continuum,"3.4 um",0,,,continuum/wise,%-w1-%,"WISE All-sky Survey (Wright et al. 2010, AJ 140, 1868)"                                                 , "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+54,WISE,Continuum,"4.6 um",0,,,continuum/wise,%-w2-%,"WISE All-sky Survey (Wright et al. 2010, AJ 140, 1868)"                                                 , "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+55,WISE,Continuum,"12 um",0,,,continuum/wise,%-w3-%,"WISE All-sky Survey (Wright et al. 2010, AJ 140, 1868)"                                                  , "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+56,WISE,Continuum,"22 um",0,,,continuum/wise,%-w4-%,"WISE All-sky Survey (Wright et al. 2010, AJ 140, 1868)"                                                  , "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+57,MAGPIS,Continuum,"20 cm",0,,,continuum/magpis,magpis.%.fits,"VLA 20cm Galactic Plane Survey (White, Becker & Helfand 2005, AJ 130, 586)"                    , "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+58,"ARO-FQS LowRes",12CO,1-0,115271000000,Hz,m.s**-1,ARO,map12co_fb2_%-cube.fits,"Arizona Radio Observatory 3rd Quadrant Survey (Benedettini et al. 2017)"     , "image",2,"o-ucd","TELESCOP","INSTRUME",PRIVATE
+59,"ARO-FQS HighRes",12CO,1-0,115271000000,Hz,m.s**-1,ARO,map12co_fb1_%-cube.fits,"Arizona Radio Observatory 3rd Quadrant Survey (Benedettini et al. 2017)"    , "image",2,"o-ucd","TELESCOP","INSTRUME",PRIVATE
+60,"ARO-FQS LowRes",13CO,1-0,110201000000,Hz,m.s**-1,ARO,map13co_fb2_%-cube.fits,"Arizona Radio Observatory 3rd Quadrant Survey (Benedettini et al. 2017)", "image",2,"o-ucd","TELESCOP","INSTRUME",PRIVATE
+61,"ARO-FQS HighRes",13CO,1-0,110201000000,Hz,m.s**-1,ARO,map13co_fb1_%-cube.fits,"Arizona Radio Observatory 3rd Quadrant Survey (Benedettini et al. 2017)", "image",2,"o-ucd","TELESCOP","INSTRUME",PRIVATE
+62,ATLASGAL,Continuum,"870 um",0,,,continuum/ATLASGAL,%.fits,"ESO-APEX Laboca Galactic Plane Survey (Schuller et al. 2009, A&A 504, 415)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+63,"CSO BGPS",Continuum,"1.1 mm",0,,,continuum/BOLOCAMGPS,%.fits,"CSO Bolocam Galactic Plane Survey (Ginsburg et al. 2013, ApJS 208, 14)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+64,"GLIMPSE 360",Continuum,"3.6 um",0,,,continuum/GLIMPSE/360/1.2_mosaics/corr,%I1.fits,"Spitzer GLIMPSE 360 Survey - Outer Galaxy Extension (Benjamin et al. 2003, PASP 115, 953)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+65,"GLIMPSE 360",Continuum,"4.5 um",0,,,continuum/GLIMPSE/360/1.2_mosaics/corr,%I2.fits,"Spitzer GLIMPSE 360 Survey - Outer Galaxy Extension (Benjamin et al. 2003, PASP 115, 953)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+66,"GLIMPSE 3D",Continuum,"3.6 um",0,,,continuum/GLIMPSE/3D/1.2_mosaics_v3.5,%I1.fits,"Spitzer GLIMPSE 3D Survey - Extension |b|<3° (Benjamin et al. 2003, PASP 115, 953)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+67,"GLIMPSE 3D",Continuum,"4.5 um",0,,,continuum/GLIMPSE/3D/1.2_mosaics_v3.5,%I2.fits,"Spitzer GLIMPSE 3D Survey - Extension |b|<3° (Benjamin et al. 2003, PASP 115, 953)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+68,"GLIMPSE 3D",Continuum,"5.8 um",0,,,continuum/GLIMPSE/3D/1.2_mosaics_v3.5,%I3.fits,"Spitzer GLIMPSE 3D Survey - Extension |b|<3° (Benjamin et al. 2003, PASP 115, 953)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+69,"GLIMPSE 3D",Continuum,"8.0 um",0,,,continuum/GLIMPSE/3D/1.2_mosaics_v3.5,%I4.fits,"Spitzer GLIMPSE 3D Survey - Extension |b|<3° (Benjamin et al. 2003, PASP 115, 953)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+70,"GLIMPSE I",Continuum,"3.6 um",0,,,continuum/GLIMPSE/I/1.2_mosaics_v3.5,%I1.fits,"Spitzer GLIMPSE Survey (Benjamin et al. 2003, PASP 115, 953)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+71,"GLIMPSE I",Continuum,"4.5 um",0,,,continuum/GLIMPSE/I/1.2_mosaics_v3.5,%I2.fits,"Spitzer GLIMPSE Survey (Benjamin et al. 2003, PASP 115, 953)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+72,"GLIMPSE I",Continuum,"5.8 um",0,,,continuum/GLIMPSE/I/1.2_mosaics_v3.5,%I3.fits,"Spitzer GLIMPSE Survey (Benjamin et al. 2003, PASP 115, 953)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+73,"GLIMPSE I",Continuum,"8.0 um",0,,,continuum/GLIMPSE/I/1.2_mosaics_v3.5,%I4.fits,"Spitzer GLIMPSE Survey (Benjamin et al. 2003, PASP 115, 953)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+74,"GLIMPSE II",Continuum,"3.6 um",0,,,continuum/GLIMPSE/II/1.2_mosaics_v3.5,%I1.fits,"Spitzer GLIMPSE II Survey - Extension |l|<10° (Benjamin et al. 2003, PASP 115, 953)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+75,"GLIMPSE II",Continuum,"4.5 um",0,,,continuum/GLIMPSE/II/1.2_mosaics_v3.5,%I2.fits,"Spitzer GLIMPSE II Survey - Extension |l|<10° (Benjamin et al. 2003, PASP 115, 953)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+76,"GLIMPSE II",Continuum,"5.8 um",0,,,continuum/GLIMPSE/II/1.2_mosaics_v3.5,%I3.fits,"Spitzer GLIMPSE II Survey - Extension |l|<10° (Benjamin et al. 2003, PASP 115, 953)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+77,"GLIMPSE II",Continuum,"8.0 um",0,,,continuum/GLIMPSE/II/1.2_mosaics_v3.5,%I4.fits,"Spitzer GLIMPSE II Survey - Extension |l|<10° (Benjamin et al. 2003, PASP 115, 953)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+78,THOR,"Continuum - 25\" res.","1.06 GHz",1060000000,Hz,m.s**-1,THOR/continuum/1060mhz,%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+79,THOR,"Continuum - 25\" res.","1.31 GHz",1310000000,Hz,m.s**-1,THOR/continuum/1310mhz,%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+80,THOR,"Continuum - 25\" res.","1.44 GHz",1440000000,Hz,m.s**-1,THOR/continuum/1440mhz,%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+81,THOR,"Continuum - 25\" res.","1.69 GHz",1690000000,Hz,m.s**-1,THOR/continuum/1690mhz,%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+82,THOR,"Continuum - 25\" res.","1.82 GHz",1820000000,Hz,m.s**-1,THOR/continuum/1820mhz,%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+83,THOR,"Continuum - 25\" res.","1.95 GHz",1950000000,Hz,m.s**-1,THOR/continuum/1950mhz,%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+84,THOR,"Continuum - Native Res.","1.06 GHz",1060000000,Hz,m.s**-1,THOR/continuum/fullres,%1060MHz%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+85,THOR,"Continuum - Native Res.","1.31 GHz",1310000000,Hz,m.s**-1,THOR/continuum/fullres,%1310MHz%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+86,THOR,"Continuum - Native Res.","1.44 GHz",1440000000,Hz,m.s**-1,THOR/continuum/fullres,%1440MHz%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+87,THOR,"Continuum - Native Res.","1.69 GHz",1690000000,Hz,m.s**-1,THOR/continuum/fullres,%1690MHz%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+88,THOR,"Continuum - Native Res.","1.82 GHz",1820000000,Hz,m.s**-1,THOR/continuum/fullres,%1820MHz%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+89,THOR,"Continuum - Native Res.","1.95 GHz",1950000000,Hz,m.s**-1,THOR/continuum/fullres,%1950MHz%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)", "image",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+90,THOR,HI,"21 cm",1420405752,Hz,m.s**-1,THOR/HI,%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+91,THOR,OH,"1.612 GHz",1612553669,Hz,m.s**-1,THOR/OH,%1612mhz%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+92,THOR,OH,"1.665-1.667 GHz",1665000000,Hz,m.s**-1,THOR/OH,%166%mhz%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+93,THOR,OH,"1.720 GHz",1720000000,Hz,m.s**-1,THOR/OH,%1720mhz%.fits,"VLA THOR Survey Inner Galactic Plane (Beuther et al. 2016, A&A 595, 32)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+94,SEDIGISM,13CO,2-1,220398680000,Hz,m.s**-1,SEDIGISM/13CO,%_13CO21_%.fits,"SEDIGISM (Schuller et al. 2017, A&A 601, 124)","cube",2,"o-ucd","TELESCOP","INSTRUME",PRIVATE
+95,SEDIGISM,C18O,2-1,219560360000,Hz,m.s**-1,SEDIGISM/C18O,%_C18O21_%.fits,"SEDIGISM (Schuller et al. 2017, A&A 601, 124)","cube",2,"o-ucd","TELESCOP","INSTRUME",PRIVATE
+96,FUGIN,12CO,1-0,115271203000,Hz,m.s**-1,FUGIN,%_12CO_%_cube.fits,"Nobeyama-45m Galactic Plane Survey (Umemoto et al. 2017, PASJ 69, 78)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+97,FUGIN,13CO,1-0,110201370000,Hz,m.s**-1,FUGIN,%_13CO_%_cube.fits,"Nobeyama-45m Galactic Plane Survey (Umemoto et al. 2017, PASJ 69, 78)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+98,FUGIN,C18O,1-0,109782182000,Hz,m.s**-1,FUGIN,%_C18O_%_cube.fits,"Nobeyama-45m Galactic Plane Survey (Umemoto et al. 2017, PASJ 69, 78)","cube",2,"o-ucd","TELESCOP","INSTRUME",PUBLIC
+99,"NANTEN2 GPS",12CO,1-0,115271000000,Hz,m.s**-1,NANTEN2,%,"NANTEN2 Galactic Plane Survey","cube",2,"o-ucd","TELESCOP","INSTRUME",PRIVATE
diff --git a/data-access/engine/src/Makefile b/data-access/engine/src/Makefile
new file mode 100644
index 0000000000000000000000000000000000000000..1444fa77e673ce350d9cbb8d83c9d81ad6be23ee
--- /dev/null
+++ b/data-access/engine/src/Makefile
@@ -0,0 +1,60 @@
+#================================================================================
+PACK_EXT    ?= rpm
+# FIXME: each package is best built on its own OS (deb -> Debian 11, rpm -> CentOS 7.9)
+EXEC_NAME   ?= vlkb
+INSTALL_DIR  = /usr/local
+VERSION     ?= $(shell git describe)
+#================================================================================
+BIN_DIR  = bin
+DEB_ROOT = debbuild
+RPM_ROOT = rpmbuild
+#================================================================================
+
+.PHONY: all clean
+
+all: vlkb vlkb-obscore vlkbd
+
+.PHONY: vlkb vlkb-obscore vlkbd
+vlkb vlkb-obscore vlkbd:
+	make -C common
+	make -C $@
+	make  EXEC_NAME=$@ $(PACK_EXT)
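+# e.g. `make PACK_EXT=deb vlkb` builds and packages only the command line client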
+
+clean:
+	make -C vlkb clean
+	make -C vlkb-obscore clean
+	make -C vlkbd clean
+	make -C common clean
+
+
+
+.PHONY: rpm
+rpm:
+	mkdir -p $(RPM_ROOT)/{BUILD,BUILDROOT,RPMS,SOURCES,SPECS,SRPMS}
+	cp $(EXEC_NAME)/$(EXEC_NAME).spec $(RPM_ROOT)/SPECS
+	cp $(EXEC_NAME)/bin/$(EXEC_NAME) $(RPM_ROOT)/SOURCES
+	rpmbuild -bb --clean --define "_topdir $(shell pwd)/$(RPM_ROOT)"  --define "_prefix /usr/local" --define "version $(shell git describe | sed -r 's/-/./g')"  $(EXEC_NAME)/$(EXEC_NAME).spec
+	find $(RPM_ROOT)/RPMS/* -name '*.rpm' -print0 | xargs -0 cp -t .
+	rm -fr $(RPM_ROOT)
+
+
+.PHONY: deb
+deb: PREFIX=$(DEB_ROOT)/$(EXEC_NAME)/usr/local
+deb:
+	mkdir -p $(DEB_ROOT)/$(EXEC_NAME)/DEBIAN $(PREFIX)
+	mkdir -p $(PREFIX)/bin $(PREFIX)/etc/$(EXEC_NAME)
+	mkdir -p $(PREFIX)/share/doc/$(EXEC_NAME)
+	mkdir -p $(PREFIX)/share/man/man1
+	sed 's/Version:.*/Version: $(VERSION)/' $(EXEC_NAME)/$(EXEC_NAME).control > $(DEB_ROOT)/$(EXEC_NAME)/DEBIAN/control
+	echo "/usr/local/etc/$(EXEC_NAME)/datasets.conf" > $(DEB_ROOT)/$(EXEC_NAME)/DEBIAN/conffiles
+	cp $(EXEC_NAME)/bin/$(EXEC_NAME) $(PREFIX)/bin
+	cp $(EXEC_NAME)/$(EXEC_NAME).datasets.conf $(PREFIX)/etc/$(EXEC_NAME)/datasets.conf
+	cp $(EXEC_NAME)/$(EXEC_NAME).changelog.Debian $(PREFIX)/share/doc/$(EXEC_NAME)/changelog.Debian
+	cp $(EXEC_NAME)/$(EXEC_NAME).copyright $(PREFIX)/share/doc/$(EXEC_NAME)/copyright
+	cp $(EXEC_NAME)/$(EXEC_NAME).1 $(PREFIX)/share/man/man1/
+	gzip --best -n $(PREFIX)/share/man/man1/$(EXEC_NAME).1
+	gzip --best -n $(PREFIX)/share/doc/$(EXEC_NAME)/changelog.Debian
+	cd $(DEB_ROOT) && dpkg-deb --root-owner-group --build $(EXEC_NAME) && mv $(EXEC_NAME).deb ../$(EXEC_NAME)-$(VERSION).deb && cd -
+	rm -fr $(DEB_ROOT)
+
+
diff --git a/data-access/engine/src/common/Makefile b/data-access/engine/src/common/Makefile
new file mode 100644
index 0000000000000000000000000000000000000000..cfe69799a172e8a43c27e39558ff99d8bf4fc2f0
--- /dev/null
+++ b/data-access/engine/src/common/Makefile
@@ -0,0 +1,86 @@
+#================================================================================
+TARGET_NAME = libvlkbcommon.a
+VERSION ?= $(shell git describe)
+BUILD_ ?= $(shell LANG=en_US date; hostname)
+#================================================================================
+DEPS_DIR := ../../ext/nlohmann-json
+DEPS_INC := $(DEPS_DIR)/include
+#DEPS_LIB := $(DEPS_DIR)/lib
+#================================================================================
+INC_DIR=src include $(DEPS_INC) \
+	/usr/include/cfitsio \
+	/usr/include/postgresql
+#================================================================================
+CC=g++
+CXX_DEBUG_FLAGS=-g -DFDB_DEBUG
+CXX_RELEASE_FLAGS=-O2
+CXX_DEFAULT_FLAGS=-c -x c++ -std=c++11 -fPIC -Wall -Wextra -Wconversion -fno-common -DVERSIONSTR='"$(VERSION)"' -DBUILD='"$(BUILD_)"'
+INC_PARM=$(foreach d, $(INC_DIR), -I$d)
+#================================================================================
+SRC_DIR:=src
+OBJ_DIR:=obj
+LIB_DIR:=lib
+#================================================================================
+TARGET 		:= $(LIB_DIR)/$(TARGET_NAME)
+CPP_FILES 	:= $(wildcard $(SRC_DIR)/*.cpp)
+C_FILES 		:= $(wildcard $(SRC_DIR)/*.c)
+OBJ_FILES   := $(addprefix $(OBJ_DIR)/,$(notdir $(CPP_FILES:.cpp=.o)))
+C_OBJ_FILES := $(addprefix $(OBJ_DIR)/,$(notdir $(C_FILES:.c=.o)))
+# deferred evaluation: expanded only after the objects have been built
+DEPS_OBJ_FILES = $(wildcard $(DEPS_DIR)/obj/*.o)
+#================================================================================
+NPROCS = $(shell grep -c 'processor' /proc/cpuinfo)
+MAKEFLAGS += -j$(NPROCS)
+#================================================================================
+.PHONY: all
+all : debug
+
+.PHONY: release
+release: CXXFLAGS+=$(CXX_RELEASE_FLAGS) $(CXX_DEFAULT_FLAGS)
+release: $(TARGET)
+
+.PHONY: debug
+debug: CXXFLAGS+=$(CXX_DEBUG_FLAGS) $(CXX_DEFAULT_FLAGS)
+debug: $(TARGET)
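+# plain `make` builds the debug variant (-g, -DFDB_DEBUG); use `make release` for -O2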
+
+
+$(TARGET) : makedir $(DEPS_OBJ_FILES) $(DEPS_LIB) $(OBJ_FILES) $(C_OBJ_FILES)
+	ar -rc $@ $(DEPS_OBJ_FILES) $(OBJ_FILES) $(C_OBJ_FILES)
+
+.PHONY : $(DEPS_LIB)
+$(DEPS_LIB):
+	make -C $(DEPS_DIR)
+
+$(OBJ_DIR)/%.o: $(SRC_DIR)/%.cpp
+	$(CC) $(CXXFLAGS) $(INC_PARM) -o $@ $<
+
+$(OBJ_DIR)/%.o: $(SRC_DIR)/%.c
+	$(CC) $(CXXFLAGS) $(INC_PARM) -o $@ $<
+
+
+.PHONY: makedir
+makedir:
+	-mkdir -p $(OBJ_DIR) $(LIB_DIR)
+
+
+.PHONY: clean_deps
+clean_deps:
+	#make -C $(DEPS_DIR) clean
+
+.PHONY: clean
+clean : clean_deps
+	-rm -fr $(OBJ_DIR) $(LIB_DIR)
+
+
+.PHONY: test
+test :
+	@tabs 20
+	@echo -e "TARGET_NAME:\t"  $(TARGET_NAME)
+	@echo -e "VERSION:\t"  $(VERSION)
+	@echo -e "CPP_FILES:\t"  $(CPP_FILES)
+	@echo -e "OBJ_FILES:\t"  $(OBJ_FILES)
+	@echo -e "C_FILES:\t"  $(C_FILES)
+	@echo -e "C_OBJ_FILES:\t"  $(C_OBJ_FILES)
+	@echo -e "INC_PARM:\t"  $(INC_PARM)
+
+
diff --git a/data-access/engine/src/common/include/ast4vl.hpp b/data-access/engine/src/common/include/ast4vl.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..46571ab47734b84cbd2969536bd8d14f8c9671de
--- /dev/null
+++ b/data-access/engine/src/common/include/ast4vl.hpp
@@ -0,0 +1,58 @@
+#ifndef AST4VL_HPP
+#define AST4VL_HPP
+
+
+#include <string>
+#include <vector>
+
+#include "cutout.hpp" // coordinate needed
+
+
+struct point2d {double lon; double lat;};
+
+struct Bounds
+{
+   std::string label;
+   std::string low_str;
+   std::string up_str;
+   std::string unit;
+   double low;
+   double up;
+   int naxis;
+};
+
+struct uint_bounds
+{
+   unsigned int pix1;
+   unsigned int pix2;
+};
+
+struct double_xy
+{
+   double x;
+   double y;
+};
+
+struct overlap_ranges
+{
+   int ov_code;
+   std::vector<double_xy> pixel_ranges;
+};
+
+
+std::ostream& operator<<( std::ostream & o, const point2d &a);
+std::ostream& operator<<( std::ostream &out, struct Bounds const& p);
+std::ostream& operator<<( std::ostream &out, struct uint_bounds const& p);
+std::ostream& operator<<( std::ostream &out, overlap_ranges const& p);
+
+
+
+
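+/* Each function below takes a complete FITS header as a single string
+ * (cf. fitsfiles::read_header); the evaluation presumably relies on the
+ * Starlink AST library, as the file name suggests. */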
+std::vector<point2d> calc_skyvertices(std::string header, std::string skysys);
+
+std::vector<Bounds> calc_bounds(std::string header, std::string skysys_str, std::string specsys_str);
+
+std::vector<uint_bounds> calc_overlap(const std::string header, const coordinates coord, int& ov_code);
+
+#endif
+
diff --git a/data-access/engine/src/common/include/cutout.hpp b/data-access/engine/src/common/include/cutout.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..65216f85d61c4b6f860e54b4dc94c73f357e9ede
--- /dev/null
+++ b/data-access/engine/src/common/include/cutout.hpp
@@ -0,0 +1,137 @@
+#ifndef CUTOUT_HPP
+#define CUTOUT_HPP
+
+#include <string>
+#include <vector>
+#include <cstdint> // FIXME: for uintmax_t - find a C++ equivalent
+
+
+enum class area {CIRCLE, RECT, RANGE, POLYGON};
+enum class skysystem  {NONE, GALACTIC, ICRS};
+enum class specsystem {NONE, VELO_LSRK, WAVE_Barycentric};
+enum class timesystem {NONE, MJD_UTC};
+
+
+/* SODA */
+
+struct circle
+{
+   double lon, lat, radius;
+};
+
+struct range
+{
+   double lon1, lon2, lat1, lat2;
+};
+
+struct polygon
+{
+   std::vector<double> lon;
+   std::vector<double> lat;
+};
+
+struct position
+{
+   skysystem sys;
+   area    shape;
+   circle  circ;
+   range   rng;
+   polygon poly;
+};
+
+const position pos_none = {.sys = skysystem::NONE, .shape = area::CIRCLE, .circ = {0, 0, 0}, .rng = {0,0,0,0}, .poly = {{},{}}};
+
+struct band
+{
+   specsystem sys;
+   double band_value[2];
+};
+
+const band band_none = {.sys = specsystem::NONE, .band_value={0,0}};
+
+struct time_axis
+{
+   timesystem sys;
+   double time_value[2];
+};
+
+const time_axis time_none = {.sys = timesystem::NONE, .time_value={0,0}};
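+
+/* Sketch of building the SODA parameters above (example values only):
+ *
+ *   position pos = pos_none;
+ *   pos.sys   = skysystem::ICRS;
+ *   pos.shape = area::CIRCLE;
+ *   pos.circ  = {266.4, -29.0, 0.1};   // lon, lat, radius [deg]
+ *
+ *   band bnd = band_none;              // leave NONE to skip the spectral cut
+ */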
+
+int min_pol_state(std::vector<std::string> pol);
+int max_pol_state(std::vector<std::string> pol);
+
+
+
+/* VLKB-legacy */
+
+struct coordinates
+{
+   coordinates();
+
+   skysystem skysys;               // mandatory: "GALACTIC"|"ICRS"
+   double lon_deg, lat_deg;        // mandatory
+   area  shape;                    // mandatory: CIRCLE dlon=dlat=2xRadius, RECT dlon may differ from dlat
+   double dlon_deg, dlat_deg;      // mandatory
+   std::vector<double> p_lon_deg;  // mandatory polygon array
+   std::vector<double> p_lat_deg;  // mandatory polygon array
+
+   specsystem specsys;             // mandatory: NONE | VELO-LSRK | WAVE-Bary
+   double vl_kmps, vu_kmps;        // optional, check specsystem == NONE
+
+   timesystem timesys;
+   double time_value[2];           // time interval (MJD in UTC)
+
+   std::vector<std::string> pol;   // polarization states FIXME pol should be Set<enums>
+};
+
+coordinates to_coordinates(const position pos, const band bnd, const time_axis time,
+       const std::vector<std::string> pol);
+
+/* cutout */
+
+struct nullvals_count_s
+{
+   double fill_ratio;
+   unsigned long long null_count;
+   unsigned long long total_count;
+};
+
+struct fits_card
+{
+   std::string key;
+   std::string value;
+   std::string comment;
+};
+
+struct cutout_res_s
+{
+   uintmax_t filesize;
+   std::string filename;
+   nullvals_count_s nullvals_count;
+};
+
+
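+// performs the cut and returns the size/name of the produced file;
+// nullvals_count is presumably meaningful only when count_null_values is set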
+cutout_res_s do_cutout_file(
+      const std::string fits_pathname, unsigned int hdunum,
+      const position pos, const band bnd, const time_axis time, const std::vector<std::string> pol,
+      const bool count_null_values,
+      const std::vector<fits_card> extra_cards,
+      const std::string conf_fits_path,
+      const std::string conf_fits_cutpath);
+
+
+// used in mcutout : deprecated to be removed from api when mcutout updated
+uintmax_t cutout_file(
+      const std::string abs_fits_pathname, unsigned int hdunum,
+      const coordinates coord,
+      const std::string vlkbcutout_pathname,
+      const std::vector<fits_card> extra_cards);
+
+
+// utils
+
+std::string generate_cut_fitsname(std::string pubdid, unsigned int hdunum);
+std::string create_timestamp();
+
+#endif
+
diff --git a/data-access/engine/src/common/include/cutout_nljson.hpp b/data-access/engine/src/common/include/cutout_nljson.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..4a86e42b6a6a16d59b37718fa6ff3725c6db685d
--- /dev/null
+++ b/data-access/engine/src/common/include/cutout_nljson.hpp
@@ -0,0 +1,22 @@
+#ifndef CUTOUT_NLJSON_HPP
+#define CUTOUT_NLJSON_HPP
+
+#include "json.hpp"
+#include "cutout.hpp"
+
+using json = nlohmann::json;
+
+void from_json(const json& j, position& p);
+void from_json(const json& j, band& p);
+void from_json(const json& j, time_axis& p);
+
+void to_json(json& j, const nullvals_count_s& p);
+void to_json(json& j, const cutout_res_s& p);
+void to_json(json& j, const fits_card& p);
+void from_json(const json& j, fits_card& p);
+
+void to_json(json& j, const coordinates& p);
+void from_json(const json& j, coordinates& p);
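+
+/* With these declared, nlohmann-json converts via ADL, e.g. (sketch):
+ *
+ *   position pos = j.at("pos").get<position>();   // uses from_json
+ *   json jres = res;                              // res : cutout_res_s, uses to_json
+ */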
+
+#endif
+
diff --git a/data-access/engine/src/common/include/cutout_ostream.hpp b/data-access/engine/src/common/include/cutout_ostream.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..164260d894dcff7d7e368b2321041f9e0704d9ed
--- /dev/null
+++ b/data-access/engine/src/common/include/cutout_ostream.hpp
@@ -0,0 +1,12 @@
+#ifndef CUTOUT_OSTREAM_HPP
+#define CUTOUT_OSTREAM_HPP
+
+#include "cutout.hpp"
+#include "mcutout.hpp"
+
+#include <iostream>
+#include <vector>
+
+std::ostream& operator<<( std::ostream &out, struct coordinates const& p);
+
+#endif
diff --git a/data-access/engine/src/common/include/fitsfiles.hpp b/data-access/engine/src/common/include/fitsfiles.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..500e364f9b4839f51c28989ecaf8356d37099407
--- /dev/null
+++ b/data-access/engine/src/common/include/fitsfiles.hpp
@@ -0,0 +1,65 @@
+
+#ifndef FITSFILES_HPP
+#define FITSFILES_HPP
+
+#include "cutout.hpp" // struct fits_card
+
+#include <set>
+#include <map>
+#include <vector>
+#include <string>
+
+namespace fitsfiles
+{
+   std::string cfitsio_errmsg(const char * filename, int line_num, int status);
+
+   // for db-ingestion
+
+   std::uintmax_t fileSize(std::string pathname);
+   std::vector<std::string> globVector(const std::string& pattern);
+ 
+   struct keys_by_type
+   {
+      std::set<std::string> strKeys;
+      std::set<std::string> uintKeys;
+      std::set<std::string> doubleKeys;
+   };
+
+   struct key_values_by_type
+   {
+      std::map<std::string, std::string>   strValues;
+      std::map<std::string, unsigned long> uintValues;
+      std::map<std::string, double>        doubleValues;
+   };
+
+   struct Hdu 
+   {
+      unsigned int m_hdunum;
+      std::string  m_header;
+      key_values_by_type key_values;
+   };
+
+   std::vector<Hdu> fname2hdrstr(std::string filename, unsigned int maxHduPos, const keys_by_type *keys = nullptr);
+
+
+   // for services
+
+   std::string read_header(std::string pathname, unsigned int hdunum);
+   void fits_hdu_cut(const std::string infile, const unsigned int hdunum,
+         const std::string outfile);
+   std::string append_card_if_not_in_header(std::string header, const std::vector<fits_card> additional_cards);
+
+   // for vlkb cmds
+
+   std::string read_card(const std::string pathname, unsigned int hdunum, const std::string keyname);
+   void        add_cards_if_missing(const std::string pathname, unsigned int hdunum, const std::vector<struct fits_card> cards);
+   int         mod_value(std::string filename, std::string token, std::string keyvalue);
+
+
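+   // presumably returns the fill ratio (cf. nullvals_count_s::fill_ratio);
+   // null_cnt and total_cnt are output parameters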
+   double calc_nullvals(std::string pathname, unsigned int hdunum,
+         unsigned long long & null_cnt, unsigned long long & total_cnt);
+
+};
+
+#endif
+
diff --git a/data-access/engine/src/common/include/io.hpp b/data-access/engine/src/common/include/io.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..b4dc8582259fe5140a175cf647f72cac365c24c1
--- /dev/null
+++ b/data-access/engine/src/common/include/io.hpp
@@ -0,0 +1,16 @@
+#ifndef IO_HPP
+#define IO_HPP
+
+#include <iostream>
+#include <fstream>
+#include <string>
+
+
+extern std::ofstream LOG_STREAM;
+void LOG_open(const std::string log_dir, const std::string log_filename);
+void LOG_close();
+
+void LOG_trace(const std::string line);
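+
+// usage sketch: LOG_open(log_dir, log_filename) once at startup,
+// LOG_trace(line) per event, LOG_close() at shutdown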
+
+#endif
+
diff --git a/data-access/engine/src/common/include/json_request.hpp b/data-access/engine/src/common/include/json_request.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..178da92b37d9ee22c1f681446524d4449a430aac
--- /dev/null
+++ b/data-access/engine/src/common/include/json_request.hpp
@@ -0,0 +1,63 @@
+#ifndef JSON_REQUEST_HPP
+#define JSON_REQUEST_HPP
+
+#include "cutout.hpp"
+#include "cutout_nljson.hpp"
+#include "mcutout.hpp"
+#include "mcutout_nljson.hpp"
+
+#include "json.hpp"
+#include <string>
+
+using json = nlohmann::json;
+
+/* All nlohmann-json exceptions are json::exception <- std::exception,
+ * so let them be caught by std::exception as 'Internal errors' in the rpc-call's infinite loop,
+ * assuming all API syntactic errors were caught in the servlet API parser. */
+
+enum service {SEARCH, CUTOUT, MCUTOUT, MERGEF, MERGE1, MERGE2, MERGE3, SUBIMG};
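+
+/* usage sketch (argument lists elided):
+ *
+ *   json_request req(request_json);
+ *   if (req.is_cutout())
+ *      do_cutout_file(..., req.get_pos(), req.get_band(), req.get_time(),
+ *                     req.get_pol(), req.count_null_values(), ...);
+ */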
+
+class json_request
+{
+   public:
+
+      json_request(std::string request_json);
+
+      bool is_search() {return m_service == SEARCH;}
+      bool is_cutout() {return m_service == CUTOUT;}
+      bool is_mcutout() {return m_service == MCUTOUT;}
+      bool is_mergefiles() {return m_service == MERGEF;}
+      bool is_mergefiles_common_header() {return m_service == MERGE1;}
+      bool is_mergefiles_reproject() {return m_service == MERGE2;}
+      bool is_mergefiles_add_reprojected() {return m_service == MERGE3;}
+      bool is_subimg() {return m_service == SUBIMG;}
+
+      std::string pubdid() {return m_jservice.at("pubdid");}
+
+      struct coordinates coordinates();
+
+      bool count_null_values(){return m_jservice.at("count_null_values");}
+
+      std::vector<struct cut_param_s> cut_params();
+
+      std::string merge_id() {return m_jservice.at("merge_id");}
+      std::string dimensionality() {return m_jservice.at("dimensionality");}
+      std::vector<std::string> files_to_merge() {return m_jservice.at("files_to_merge");}
+      std::string fitsfilename() {return m_jservice.at("fits_filename");}
+
+      /* SUBIMG */
+
+      std::string abs_subimg_pathname() {return m_jservice.at("subimg_filename");}
+      std::string img_pathname()    {return m_jservice.at("img_pathname");}
+      int         img_hdunum()      {return m_jservice.at("img_hdunum");}
+      std::vector<struct fits_card> extra_cards();
+
+      /* new: no coordinates instead separate pos band time pol */
+
+      position     get_pos()  { return (m_jservice.contains("pos")  ? (position)    m_jservice.at("pos")  : pos_none ); }
+      band         get_band() { return (m_jservice.contains("band") ? (band)        m_jservice.at("band") : band_none); }
+      time_axis    get_time() { return (m_jservice.contains("time") ? (time_axis)   m_jservice.at("time") : time_none); }
+      std::vector<std::string> get_pol();
+
+   private:
+      json m_jservice;
+      service m_service;
+};
+
+#endif
diff --git a/data-access/engine/src/common/include/mcutout.hpp b/data-access/engine/src/common/include/mcutout.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..e4fdeb7b22da3c8451036bf73aef7db7c5e54fce
--- /dev/null
+++ b/data-access/engine/src/common/include/mcutout.hpp
@@ -0,0 +1,85 @@
+#ifndef MCUTOUT_HPP
+#define MCUTOUT_HPP
+
+#include "cutout.hpp" // coordinates fits_card structs needed
+#include "mcutout.hpp"
+
+#include <string>
+#include <vector>
+#include <cstdint> // FIXME: for uintmax_t - find a C++ equivalent
+
+
+enum class content_type {FILENAME, BAD_REQUEST, SERVICE_ERROR};
+std::string to_string(content_type ss);
+
+struct cut_param_s
+{
+   std::string pubdid;
+   coordinates coord;
+   bool countNullVals;
+   /* resolver adds: */
+   std::string filename;
+   unsigned int hdunum;
+   std::vector<struct fits_card> cards;
+};
+
+struct cut_resp_s
+{
+   cut_param_s input;
+   content_type type;
+   std::string content;
+   /* FIXME misses countNullVals results */
+};
+
+struct mcutout_res_s
+{
+   uintmax_t filesize;
+   std::string tgz_filename;
+   std::vector<cut_resp_s> responses;
+};
+
+
+struct mcutout_res_s mcutout(std::vector<struct cut_param_s> cut_params,
+      const std::string fits_path, const std::string fits_cut_path);
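+
+/* Usage sketch (paths are hypothetical): resolve the per-cutout parameters,
+ * then run all cutouts and pack the results into one tar.gz:
+ *
+ *   std::vector<struct cut_param_s> params = ...; // e.g. from json_request::cut_params()
+ *   struct mcutout_res_s res = mcutout(params, "/srv/fits", "/srv/cutouts");
+ *   // res.tgz_filename names the archive; res.responses holds one cut_resp_s
+ *   // per request: FILENAME on success, BAD_REQUEST / SERVICE_ERROR otherwise.
+ */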
+
+
+/* mergefiles split (enables to call reproject in parallel) */
+
+
+void xmergefiles_common_header(
+      const std::string merge_id,
+      const std::vector<std::string> fitsfiles,
+      const std::string dimensionality,
+      const std::string merge_dir,
+      const std::string result_dir);
+
+
+void xmergefiles_reproject(
+      const std::string merge_id,
+      const std::string fitsfilename,
+      const std::string dimensionality,
+      const std::string merge_dir,
+      const std::string result_dir);
+
+
+unsigned long xmergefiles_add_reprojected(
+      const std::string merge_id,
+      const std::string dimensionality,
+      const std::string merge_dir,
+      const std::string result_dir,
+      std::string& merged_file_pathname);
+
+
+/* legacy mergefiles (serial exec) */
+
+
+unsigned long xmergefiles(
+      const std::vector<std::string> fitsfile,
+      const std::string dimensionality,
+      const std::string merge_dir,
+      const std::string result_dir,
+      std::string& merged_file_pathname);
+
+
+#endif
+
diff --git a/data-access/engine/src/common/include/mcutout_nljson.hpp b/data-access/engine/src/common/include/mcutout_nljson.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..67c213037adcca570f504d0a3475e0f872ee18c1
--- /dev/null
+++ b/data-access/engine/src/common/include/mcutout_nljson.hpp
@@ -0,0 +1,16 @@
+#ifndef MCUTOUT_NLJSON_HPP
+#define MCUTOUT_NLJSON_HPP
+
+#include "json.hpp"
+#include "mcutout.hpp"
+
+using json = nlohmann::json;
+
+void to_json(json& j, const mcutout_res_s& p);
+void to_json(json& j, const cut_param_s& p);
+void to_json(json& j, const cut_resp_s& p);
+
+void from_json(const json& j, cut_param_s& p);
+void from_json(const json& j, cut_resp_s& p);
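+
+/* With these ADL hooks in scope, nlohmann-json converts implicitly, e.g. (illustrative):
+ *
+ *   cut_param_s p = j.get<cut_param_s>();   // calls from_json(j, p)
+ *   json jj = p;                            // calls to_json(jj, p)
+ */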
+
+#endif
diff --git a/data-access/engine/src/common/include/mcutout_ostream.hpp b/data-access/engine/src/common/include/mcutout_ostream.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..4f6569bf69af5403fabdc2397e93dfe47bd62f6b
--- /dev/null
+++ b/data-access/engine/src/common/include/mcutout_ostream.hpp
@@ -0,0 +1,13 @@
+#ifndef MCUTOUT_OSTREAM_HPP
+#define MCUTOUT_OSTREAM_HPP
+
+#include "cutout.hpp" // coordinates needed
+#include "mcutout.hpp"
+
+#include <iostream>
+#include <vector>
+
+std::ostream& operator<<( std::ostream &out, struct ::cut_param_s const& p);
+std::ostream& operator<<( std::ostream &out, struct cut_resp_s const& p);
+
+#endif
diff --git a/data-access/engine/src/common/include/my_assert.hpp b/data-access/engine/src/common/include/my_assert.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..7c926de5a77aa26fbff183d416dbb0323c8c3891
--- /dev/null
+++ b/data-access/engine/src/common/include/my_assert.hpp
@@ -0,0 +1,9 @@
+#ifndef MY_ASSERT_HPP
+#define MY_ASSERT_HPP
+
+#include <string>
+#include <stdexcept>
+
+void my_assert(bool statement, std::string src_filename, int line_no, std::string msg);
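+
+/* Call sites pass the location explicitly, e.g. (taken from usage in the engine):
+ *
+ *   my_assert(!bounds.empty(), __FILE__, __LINE__, "bounds vector is empty");
+ *
+ * on a false statement it presumably throws (the header pulls in <stdexcept>). */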
+
+#endif
diff --git a/data-access/engine/src/common/src/ast4vl.cpp b/data-access/engine/src/common/src/ast4vl.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..a0da6e352297aaa6960d3b101dcd9c3525e6d0b7
--- /dev/null
+++ b/data-access/engine/src/common/src/ast4vl.cpp
@@ -0,0 +1,164 @@
+
+
+#include "ast4vl.hpp"
+#include "ast_frameset.hpp"
+#include "io.hpp"
+#include "math.h" // round() needed
+#include "cutout_ostream.hpp" // coordinates needed
+
+#include <ostream>
+//#include <vector>
+
+using namespace std;
+
+
+std::ostream& operator << (std::ostream & o, const point2d &a)
+{
+   return o << "(" << a.lon << ", " << a.lat << ")";
+}
+
+
+
+std::ostream& operator<<( std::ostream &out, struct Bounds const& p)
+{
+   out << p.label << " [" << p.naxis << "] " << "[" << p.unit << "] " << p.low_str << " .. " << p.up_str;
+   return out;
+}
+
+
+
+std::ostream& operator<<( std::ostream &out, struct uint_bounds const& p)
+{
+   out << "(" << p.pix1 << ", " << p.pix2 << ")";
+   return out;
+}
+
+std::ostream& operator<<( std::ostream &out, overlap_ranges const& p)
+{
+   out << p.ov_code;
+   for(double_xy r : p.pixel_ranges) out  <<" ("<< r.x << ", " << r.y  << ")";
+
+   return out;
+}
+
+
+
+std::vector<point2d> calc_skyvertices(std::string header, std::string skysys)
+{
+   LOG_trace(__func__);
+
+   ast::frameset frm_set(header);
+   frm_set.set_skysystem(skysys);
+
+   vector<point2d> skyvert = frm_set.sky_vertices();
+
+   return skyvert;
+}
+
+
+std::vector<Bounds> calc_bounds(std::string header, std::string skysys, std::string specsys)
+{
+   LOG_trace(__func__);
+
+   ast::frameset frm_set(header);
+   frm_set.set_skysystem(skysys); 
+   if(frm_set.has_specframe())
+      frm_set.set_specsystem(specsys);
+
+   vector<Bounds> bounds_vec = frm_set.bounds();
+
+   return bounds_vec;
+}
+
+
+
+/*
+ * ASTlib manual p.604: H.2 Changes Introduced in V1.2
+ * ...
+ * 6. When a FrameSet is created from a set of FITS header cards (by reading from a FitsChan
+ *    using a “foreign” encoding), the base Frame of the resulting FrameSet now has its Domain
+ *    attribute set to “GRID”. This reflects the fact that this Frame represents FITS data grid
+ *    coordinates (equivalent to FITS pixel coordinates—see §7.13). Previously, this Domain
+ *    value was not set.
+ * ...
+ *
+ * p77: 7.13 Conventions for Domain Names
+ * ...
+ * GRID
+ * Identifies the instantaneous data grid used to store and handle data, together
+ * with an associated coordinate system. In this coordinate system, the first el-
+ * ement stored in an array of data always has a coordinate value of unity at its
+ * centre and all elements have unit extent. This applies to all dimensions.
+ * ...
+ * PIXEL
+ * Identifies an array of pixels and an associated pixel-based coordinate system
+ * which is related to the GRID coordinate system (above) simply by a shift of
+ * origin along each axis. This shift may be integral, fractional, positive, negative
+ * or zero. The data elements retain their unit extent along each axis.
+ * ...
+ * The GRID domain (which corresponds with the pixel-numbering convention
+ * used by FITS) is a special case of the PIXEL domain and avoids this uncertainty.
+ *
+ *
+ * FIXME add check : 'Domain ?= GRID' where assumption is made that first data point is centered at (1,1)
+ */
+std::vector<uint_bounds> calc_overlap(const std::string header, const coordinates coord, int& ov_code)
+{
+   LOG_trace(__func__);
+
+   ast::frameset frm_set(header);
+
+   LOG_STREAM << "INPUT coord: " << coord << endl;
+
+   frm_set.set(coord.skysys);
+
+   if(frm_set.has_specframe())
+      frm_set.set(coord.specsys);
+
+   if(frm_set.has_timeaxis())
+      frm_set.set(coord.timesys);
+
+   overlap_ranges pix_ranges = frm_set.overlap(coord);
+
+   ov_code = pix_ranges.ov_code;
+
+   LOG_STREAM << "ov-code & pix ranges[double]: " << pix_ranges << endl;
+
+   /* convert to uint */
+
+   vector<uint_bounds> uint_bounds_vec;
+
+   LOG_STREAM << "pix ranges[uint]:";
+   for(double_xy dbl_range : pix_ranges.pixel_ranges)
+   {
+      if(dbl_range.x < 0)
+         throw out_of_range(string{__FILE__} + ":" + to_string(__LINE__)
+               + " pixel axis from overlap x is negative " + to_string(dbl_range.x));
+
+      if(dbl_range.y < 0)
+         throw out_of_range(string{__FILE__} + ":" + to_string(__LINE__)
+               + " pixel axis from overlap y is negative " + to_string(dbl_range.y));
+
+      // FIXME review conversion double -> uint: result must be: 1 <= result <= NAXISn
+      // because NAXISn start with 1 (FITS standard) which corresponds to ASTlib's GRID-domain
+      // FitsChan uses GRID Domain for FITS-pixel coords
+      if(dbl_range.x <= dbl_range.y)
+      {
+         uint_bounds ui_range{round(dbl_range.x), /*round*/(dbl_range.y)};
+         uint_bounds_vec.push_back(ui_range);
+         LOG_STREAM << " " << ui_range;
+      }
+      else
+      {
+         uint_bounds ui_range{round(dbl_range.y), /*round*/(dbl_range.x)};
+         uint_bounds_vec.push_back(ui_range);
+         LOG_STREAM << " " << ui_range;
+      }
+
+   }
+   LOG_STREAM << endl;
+
+   return uint_bounds_vec;
+}
+
+
diff --git a/data-access/engine/src/common/src/ast_frameset.cpp b/data-access/engine/src/common/src/ast_frameset.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..f08299b46fdaa1dfc3ee3fd060c27d7117f826ae
--- /dev/null
+++ b/data-access/engine/src/common/src/ast_frameset.cpp
@@ -0,0 +1,1391 @@
+
+
+
+#include "ast_frameset.hpp"
+#include "io.hpp"
+#include "my_assert.hpp"
+
+#include <stdexcept>
+
+/* for almost_equal(double,double,int) */
+#include <cmath>
+#include <limits>
+#include <ostream>
+#include <array>
+#include <iterator>// begin() end()
+
+#include <math.h> // M_PI
+
+#define D2R (M_PI/180.0)
+#define R2D (180.0/M_PI)
+
+const int AXES_CNT{5};
+
+using namespace std;
+
+/* NOTE in memory management: */
+/* AST 13.9 : [astGetFrame] 'would return a pointer (not a copy) to the base Frame' */
+/* AST p.246: [astGetFrame] 'increments the RefCount attribute of the selected Frame by one.' */
+/* AST p195: [astAnnul] 'function also decrements the Object’s RefCount attribute by one' */
+
+string ast_status_string()
+{
+   switch(astStatus)
+   {
+      case AST__NODEF: return "AST__NODEF";
+      default: return to_string(astStatus);
+
+   }
+}
+
+string failed_with_status(const char * file, int line, string func)
+{
+   return string{file} + ":" + to_string(line) + " " + func + " failed with status: " + ast_status_string();
+}
+
+
+ast::frameset::~frameset()
+{
+   LOG_trace(__func__);
+   astClearStatus;
+   astEnd;
+   LOG_STREAM << "~frameset destructor finished" << endl;
+}
+
+ast::frameset::frameset(string header)
+   :m_NAXISn(AXES_CNT)
+   ,m_hdr_fs((AstFrameSet*)AST__NULL)
+   ,m_has_specframe{false}
+   ,m_has_stokes_axis{false}
+   ,m_has_time_axis{false}
+{
+   LOG_trace(__func__);
+
+   astClearStatus;
+   astBegin;
+
+   AstFitsChan * fchan = astFitsChan( NULL, NULL, " " );
+
+   astPutCards(fchan, header.c_str());
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astPutChards"));
+
+   const char * encoding = astGetC(fchan,"Encoding");
+   LOG_STREAM << __func__ << " : Encoding: " << (encoding == AST__NULL ? "NULL" : encoding) << endl;
+
+   int NAXIS;
+   astGetFitsI( fchan, "NAXIS", &NAXIS );
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetFitsI(NAXIS)"));
+
+   if( (NAXIS > AXES_CNT) || (NAXIS < 0) )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"FITS NAXIS out or range : 1 .. " + to_string(AXES_CNT) ));
+
+   m_NAXIS = NAXIS;
+
+
+   int NAXISn[AXES_CNT];
+
+   int ix;
+   for( ix = 0; ix < NAXIS; ix++ )
+   {
+      char keyword[ 9 ];
+      sprintf( keyword, "NAXIS%d", ix + 1 );
+
+      if( !astGetFitsI( fchan, keyword, &(NAXISn[ix]) ) )
+         throw runtime_error(failed_with_status(__FILE__,__LINE__, "astGetFitsI(" + string(keyword) + ") "));
+   }
+
+//   std::vector<int> naxis_arr(AXES_CNT);
+//   std::copy(std::begin(NAXISn), std::end(NAXISn), naxis_arr.begin());
+   std::copy(std::begin(NAXISn), std::end(NAXISn), m_NAXISn.begin());
+
+
+   astClear(fchan,"Card");// rewind channel to first card
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astClear(fchan)"));
+
+   AstObject * object = (AstObject*)astRead( fchan );
+   if ( (!astOK) || (object == AST__NULL))
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astReadCards"));
+
+
+   if( !object )
+   {
+      LOG_STREAM << "Failed to read an AST Object from header" << endl;
+   }
+   else if( !astIsAFrameSet( object ) )
+   {
+      log_warnings(fchan);
+
+      const char * astclass = astGetC( object, "Class" );
+      if ( (!astOK) || (astclass == AST__NULL))
+         throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetC(Class)"));
+      else
+         throw invalid_argument(string{__FILE__}+":"+to_string(__LINE__)+": expected a FrameSet but read a " + string{astclass});
+   }
+   else
+   {
+      log_warnings(fchan);
+
+      m_hdr_fs = (AstFrameSet *) object;
+   }
+
+   astAnnul(fchan);
+
+   m_has_specframe = has_specframe();
+   LOG_STREAM << "m_has_specframe: " << boolalpha << m_has_specframe << endl;
+   if(m_has_specframe) set_spec_axis();
+
+   set_pol_time_axis();
+
+   assert_valid_state();
+
+   serialize(LOG_STREAM); LOG_STREAM << endl;
+}
+
+
+
+bool is_specdomain(AstFrame * frm)
+{
+   const char * val = astGetC(frm,"Domain");
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetC( Domain )"));
+
+   string domain_val{val};
+
+   return ( (domain_val.compare("SPECTRUM") == 0) ||
+            (domain_val.compare("DSBSPECTRUM") == 0) );
+}
+
+
+
+/* frame : AstCmpFrame or AstFrame */
+bool ast::frameset::has_specframe(void)
+{
+   LOG_trace(__func__);
+
+   AstFrame * frame = (AstFrame*)astGetFrame(m_hdr_fs, AST__CURRENT);
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetFrame"));
+
+   AstCmpFrame * cmpframe = (AstCmpFrame*)frame;
+
+   AstCmpFrame * first  = (AstCmpFrame*)AST__NULL;
+   AstFrame * second = (AstFrame*)AST__NULL;
+
+   int cnt = 0;
+   do
+   {
+      int series,invfirst,invsecond;
+
+      astDecompose((AstMapping*)cmpframe,
+            (AstMapping**)&first,
+            (AstMapping**)&second,
+            &series,&invfirst,&invsecond);
+      if ( !astOK ) 
+         throw runtime_error(failed_with_status(__FILE__,__LINE__,"astDecompose"));
+
+      AstFrame * frm = (AstFrame*)AST__NULL; 
+
+      if(second == AST__NULL)
+      {
+         frm = (AstFrame*)first;
+      }
+      else
+      {
+         frm = second;
+      }
+
+      int isspec = astIsASpecFrame((AstFrame*)frm);
+      if(isspec && is_specdomain(frm)) return true;
+
+      cmpframe = first;
+
+      cnt++;
+
+   } while(second != AST__NULL);
+
+   return false;
+}
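+
+/* The loop above walks a (possibly nested) CmpFrame via astDecompose: each pass
+ * inspects the 'second' component and descends into 'first', so for example
+ * CmpFrame(CmpFrame(Sky,Spec),Stokes) is visited as Stokes, Spec, Sky; for an
+ * atomic Frame astDecompose returns it in 'first' with 'second' NULL, which ends
+ * the loop. find_skyframe() below uses the same traversal. */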
+
+
+void* ast::frameset::find_skyframe(void)
+{
+   LOG_trace(__func__);
+
+   AstFrame * frame = (AstFrame*)astGetFrame(m_hdr_fs, AST__CURRENT);
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetFrame"));
+
+   AstCmpFrame * cmpframe = (AstCmpFrame*)frame;
+
+   AstCmpFrame * first  = (AstCmpFrame*)AST__NULL;
+   AstFrame * second = (AstFrame*)AST__NULL;
+
+   int cnt = 0;
+   do
+   {
+      int series,invfirst,invsecond;
+
+      astDecompose((AstMapping*)cmpframe,
+            (AstMapping**)&first,
+            (AstMapping**)&second,
+            &series,&invfirst,&invsecond);
+      if ( !astOK ) 
+         throw runtime_error(failed_with_status(__FILE__,__LINE__,"astDecompose"));
+
+      AstFrame * frm = (AstFrame*)AST__NULL; 
+
+      if(second == AST__NULL)
+      {
+         frm = (AstFrame*)first;
+      }
+      else
+      {
+         frm = second;
+      }
+
+      int issky = astIsASkyFrame((AstFrame*)frm);
+      if(issky) return (void*) frm;
+
+      cmpframe = first;
+
+      cnt++;
+
+   } while(second != AST__NULL);
+
+   return nullptr;
+}
+
+
+
+
+bool ast::frameset::has_timeaxis(void)
+{
+   LOG_trace(__func__);
+   return m_has_time_axis;
+}
+
+
+
+/* FIXME decide which spectral domain types to consider, or match the sub-string SPECTRUM instead of the full string.
+ * Or we just say NOT SUPPORTED and print the domain name. */
+void ast::frameset::set_spec_axis(void)
+{
+   LOG_trace(__func__);
+
+   int naxes = astGetI( m_hdr_fs, "Naxes" );
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetI( Naxes )"));
+
+   LOG_STREAM << "m_NAXIS vs Naxes :" << m_NAXIS << " vs " << naxes << endl;
+
+
+   bool set_axis{false};
+   LOG_STREAM << "Domains in header FrameSet(CURRENT): ";
+   int axis;
+   for(axis=1; axis<(naxes+1);axis++)
+      //for(axis=1; axis<(m_NAXIS+1);axis++)
+   {
+      string domain{"Domain("+to_string(axis)+")"};
+      const char * val =  astGetC(m_hdr_fs,domain.c_str());
+      if ( !astOK )
+         throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetC( " + domain + ")"));
+
+      string domain_val{val};
+      LOG_STREAM << " " << domain_val;
+      if((domain_val.compare("SPECTRUM") == 0) ||
+            (domain_val.compare("DSBSPECTRUM") == 0))
+      {
+         m_spec_axis = axis;
+         set_axis = true;
+         break;
+      }
+   }
+   LOG_STREAM << endl;
+
+   if(!set_axis)
+      my_assert(false, __FILE__,__LINE__,
+            ": set_spec_axis may be called only if a spectral axis exists; m_has_specframe: " + to_string(m_has_specframe));
+}
+
+
+
+void ast::frameset::set_pol_time_axis(void)
+{
+   LOG_trace(__func__);
+
+   int naxes = astGetI( m_hdr_fs, "Naxes" );
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetI( Naxes )"));
+
+   LOG_STREAM << "m_NAXIS vs Naxes :" << m_NAXIS << " vs " << naxes << endl;
+
+   LOG_STREAM << "Domains/Symbols in header FrameSet(CURRENT): ";
+   int axis;
+   for(axis=1; axis<(naxes+1);axis++)
+   {
+      string domain{"Domain("+to_string(axis)+")"};
+      string symbol{"Symbol("+to_string(axis)+")"};
+      const char * c_domain_val =  astGetC(m_hdr_fs,domain.c_str());
+      if ( !astOK )
+         throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetC( " + domain + ")"));
+      const char * c_symbol_val =  astGetC(m_hdr_fs,symbol.c_str());
+      if ( !astOK )
+         throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetC( " + symbol + ")"));
+
+      string domain_val{c_domain_val};
+      string symbol_val{c_symbol_val};
+      LOG_STREAM << " " << domain_val << "/" << symbol_val;
+      if((domain_val.find("STOKES") != string::npos) && (symbol_val.compare("STOKES") == 0))
+      {
+         m_stokes_axis = axis;
+         m_has_stokes_axis = true;
+         //break;
+      }
+      else if(domain_val.compare("TIME") == 0)
+      {
+         m_time_axis = axis;
+         m_has_time_axis = true;
+         //break;
+      }
+   }
+   LOG_STREAM << endl;
+}
+
+
+
+
+void ast::frameset::set(skysystem skysys)
+{
+   LOG_trace(__func__);
+   switch(skysys)
+   {
+      case skysystem::GALACTIC: set_skysystem("Galactic"); break;
+      case skysystem::ICRS:     set_skysystem("ICRS"); break;
+      case skysystem::NONE:     /* noop */ break;
+   }
+}
+
+
+
+void ast::frameset::set(specsystem specsys)
+{
+   LOG_trace(__func__);
+   if( m_has_specframe  )
+   {
+      /* p102: in a CmpFrame the attribute is set in _all_ frames which have it. FIXME why does Unit(4) not apply to axis 4 only ? */
+      switch(specsys)
+      {
+         case specsystem::VELO_LSRK:        set_specsystem("System=VELO,StdOfRest=LSRK,Unit=km/s"); break;
+         case specsystem::WAVE_Barycentric: set_specsystem("System=WAVE,StdOfRest=Bary,Unit=m"); break;
+         case specsystem::NONE:             /* noop */ break;
+      }
+   }
+}
+
+
+
+void ast::frameset::set(timesystem timesys)
+{
+   LOG_trace(__func__);
+   if( m_has_time_axis )
+   {
+      switch(timesys)
+      {
+         case timesystem::MJD_UTC: set_timesystem("System=MJD,TimeScale=UTC"); break;
+         case timesystem::NONE:    /* noop */ break;
+      }
+   }
+}
+
+
+
+
+void ast::frameset::set_skysystem(string skysys_str)
+{
+   LOG_trace(__func__);
+
+   if(!skysys_str.empty())
+   {
+      //int status = 0;
+      astSet( m_hdr_fs, "System=%s", skysys_str.c_str());//, &status );
+      if ( !astOK )
+         throw runtime_error(failed_with_status(__FILE__,__LINE__,"astSet(skysystem " + skysys_str + ")"));
+   }
+
+   assert_valid_state();
+}
+
+
+
+void ast::frameset::set_specsystem(string specsys_str)
+{
+   LOG_trace(__func__);
+
+   if(!specsys_str.empty())
+   {
+      int status = 0;
+      astSet( m_hdr_fs, specsys_str.c_str(), &status );
+      if ( !astOK )
+         throw runtime_error(failed_with_status(__FILE__,__LINE__,"astSet(specsystem " + specsys_str + ")"));
+   }
+
+   assert_valid_state();
+}
+
+
+
+void ast::frameset::set_timesystem(string timesys_str)
+{
+   LOG_trace(string{__func__});
+
+   if(!timesys_str.empty())
+   {
+      int status = 0;
+      astSet( m_hdr_fs, timesys_str.c_str(), &status );
+      if ( !astOK )
+         throw runtime_error(failed_with_status(__FILE__,__LINE__,"astSet(timesystem " + timesys_str + ")"));
+   }
+
+   assert_valid_state();
+}
+
+
+
+
+
+AstRegion * ast::frameset::create_header_region(void)
+{
+   LOG_trace(__func__);
+
+   /* get PIXEL -> WCS-FRAME mapping
+    *
+    * FIXME see whether simpler with Invert(BASE<->CURRENT) */
+
+   AstFrame * wcsfrm = (AstFrame*)astGetFrame( m_hdr_fs, AST__CURRENT );
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetFrame( CURRENT )"));
+
+   AstFrame * pixfrm = (AstFrame*)astGetFrame( m_hdr_fs, AST__BASE );
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetFrame( BASE )"));
+
+   AstMapping * pix2wcs = (AstMapping*)astGetMapping( m_hdr_fs, AST__BASE, AST__CURRENT );
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetMapping( BASE->CURRENT )"));
+
+
+   /* Create AstBox from NAXISn */
+
+   double p1[ AXES_CNT ];
+   double p2[ AXES_CNT ];
+
+   int ix;
+   for(ix = 0; ix < m_NAXIS; ix++)
+   {
+      p1[ix] = 0.5;
+      p2[ix] = 0.5 + m_NAXISn[ix];
+   }
+
+   AstBox * pixbox = astBox( pixfrm, 1, p1, p2, NULL, " " );
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astBox( NAXISn )"));
+
+
+   /* map header's pixels to WCS-region */
+
+   AstRegion * wcsbox = (AstRegion*)astMapRegion( pixbox, pix2wcs, wcsfrm );
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astMapRegion( pixbox -> wcsbox )"));
+
+   return wcsbox;
+}
+
+
+vector<Bounds> ast::frameset::bounds(void)
+{
+   LOG_trace(__func__);
+
+   AstRegion * wcsbox = create_header_region();
+
+   /* finally, get the bounds */
+
+   double low[ AXES_CNT ];
+   double up[ AXES_CNT ];
+
+   astGetRegionBounds(wcsbox, low, up);
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetRegionBounds"));
+
+   int ix;
+   for(ix=0; ix<m_NAXIS; ix++) LOG_STREAM << "Before /bounds(): " << R2D * low[ix] << " " << R2D * up[ix] << endl;
+
+   // FIXME use ? Bounds to return the bounds - do as in ast4vl.c::ast4vl_bouns_set() 
+
+   AstFrame * rgn = (AstFrame*)wcsbox;// FIXME
+
+   vector<Bounds> bounds_vec;
+
+   char attrib[ 9 ];// FIXME length
+   for( ix = 0; ix < m_NAXIS; ix++ )
+   {
+      /* read label */
+      sprintf( attrib, "Label(%d)", ix + 1 );
+      const char * lbl = astGetC( rgn, attrib );
+      if ( !astOK || (lbl == AST__NULL) )
+      {
+         // throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetC( Label("+to_string(ix+1)+") )"));
+         lbl = "<n/a>";
+         astClearStatus;
+      }
+
+      const char * low_str = astFormat(rgn,ix+1,low[ ix ]);// ! AST manual p.239: returned pointer remains valid for 50 calls
+      if ( !astOK || (low_str == AST__NULL) )
+      {  
+         // throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetC( Label("+to_string(ix+1)+") )"));
+         low_str = "<n/a>";
+         astClearStatus;
+      }
+      const char * up_str = astFormat(rgn,ix+1,up[ ix ]);// ! AST manual p.239: returned pointer remains valid for 50 calls
+      if ( !astOK || (up_str == AST__NULL) )
+      {  
+         // throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetC( Label("+to_string(ix+1)+") )"));
+         up_str = "<n/a>";
+         astClearStatus;
+      }
+
+      /* read unit */
+      sprintf( attrib, "Unit(%d)", ix + 1 );
+      const char * unit = astGetC( rgn, attrib );
+      if ( !astOK || (unit == AST__NULL) )
+      {
+         // throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetC( Unit("+to_string(ix+1)+") )"));
+         unit = "<n/a>";
+         astClearStatus;
+      }
+
+      /* collect returned data */
+      string tlabel = string{lbl};
+      string tlow_str = string{low_str};
+      string tup_str  = string{up_str};
+      string tunit  = string{unit};
+
+      Bounds bounds{ tlabel, tlow_str, tup_str, tunit,   low[ix], up[ix], m_NAXISn[ix] };
+
+      bounds_vec.push_back(bounds);
+   }
+
+   return bounds_vec;
+}
+
+#if 0
+/* uses only NAXIS to create the header region (however the FITS header axes count can be higher) */
+overlap_ranges ast::frameset::overlap_in_wcs(coordinates coord)
+{
+   LOG_trace(__func__);
+
+   /* get all header bounds */
+
+   AstRegion * wcsbox = create_header_region();
+
+   double low[ AXES_CNT ];
+   double up[ AXES_CNT ];
+
+   astGetRegionBounds(wcsbox, low, up);
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetRegionBounds"));
+
+
+   /* overwrite bounds for sky-axis and spec axis if given by coord input */
+
+   int ix_lon = astGetI(m_hdr_fs,"LonAxis") - 1;
+   int ix_lat = astGetI(m_hdr_fs,"LatAxis") - 1;
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetI( Lon/LatAxis )"));
+
+   LOG_STREAM << "LonAxis LatAxis indeces (zero-based): " << to_string(ix_lon) << " " << to_string(ix_lat) << endl; 
+
+   low[ix_lon] = D2R * (coord.lon_deg - coord.dlon_deg/2.0); 
+   low[ix_lat] = D2R * (coord.lat_deg - coord.dlat_deg/2.0);
+   up[ix_lon]  = D2R * (coord.lon_deg + coord.dlon_deg/2.0); 
+   up[ix_lat]  = D2R * (coord.lat_deg + coord.dlat_deg/2.0);
+
+   if(m_has_specframe && (coord.specsys != specsystem::NONE))
+   {
+      /* FIXME assumes m_frm_fs is VELO and unit is km/s How to align - if set(specsystem not called) ? */
+
+      string unit_key{"Unit("+to_string(m_spec_axis)+")"};
+      const char * cunit = astGetC(m_hdr_fs, unit_key.c_str());
+      string unit(cunit);
+
+      string std_of_rest_key{"StdOfRest("+to_string(m_spec_axis)+")"};
+      const char * cstd_of_rest = astGetC(m_hdr_fs, std_of_rest_key.c_str());
+      string std_of_rest(cstd_of_rest);
+
+      LOG_STREAM << "SpectralAxisUnit:       " << unit << endl;
+      LOG_STREAM << "SpectralAxis StdOfRest: " << std_of_rest << endl;
+
+      low[m_spec_axis-1] = coord.vl_kmps;
+      up[m_spec_axis-1] = coord.vu_kmps;
+   }
+
+   LOG_STREAM << string{__func__} << ": create input coord-region (angles in rad): low-bnd up-bnd" << endl;
+   int ix;
+   for(ix=0; ix<m_NAXIS; ix++) LOG_STREAM << "AstBox rad: " << low[ix] << " " << up[ix] 
+      << " deg: " << R2D*low[ix] << " " << R2D*up[ix] << endl;
+
+   /* FIXME ignored coord.shape; add circle later : matters for overlap-code (but not for cut, which is always rect) */
+   AstRegion * crgn = (AstRegion *)astBox( wcsbox, 1, low, up , (AstRegion*)AST__NULL," ");
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astBox( coord )"));
+
+
+   /* overlap */
+
+   int ov_code = astOverlap( wcsbox, crgn );
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astOverlap( header, caller-coord )"));
+
+
+   /* if overlap, calc pixel ranges */
+
+   vector<double_xy> pix_ranges;
+
+   bool no_overlap = ((ov_code == 1) || (ov_code == 6));
+   if(!no_overlap)
+   {
+      AstCmpRegion * wcsOverlap = astCmpRegion( wcsbox, crgn, AST__AND, " ");
+      if ( !astOK || (wcsOverlap == AST__NULL) )
+         throw runtime_error(failed_with_status(__FILE__,__LINE__,"astCmpRegion( header AND coord )"));
+
+      AstFrame    * pixfrm;
+      AstMapping  * wcs2pix;
+      pixfrm = (AstFrame*)astGetFrame( m_hdr_fs, AST__BASE );
+      if ( !astOK )
+         throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetFrame( header-fs )"));
+
+      wcs2pix = (AstMapping*)astGetMapping( m_hdr_fs, AST__CURRENT, AST__BASE );
+      if ( !astOK )
+         throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetMapping( CURRENT->BASE )"));
+
+      AstCmpRegion * pixOverlap = (AstCmpRegion*)astMapRegion( wcsOverlap, wcs2pix, pixfrm );
+      if ( !astOK )
+         throw runtime_error(failed_with_status(__FILE__,__LINE__,"astMapRegion( wcsOverlap -> pixOverlap )"));
+
+      double lbnd[AXES_CNT];
+      double ubnd[AXES_CNT];
+      astGetRegionBounds(pixOverlap, lbnd, ubnd);
+      if ( !astOK )
+         throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetRegionBounds( pixOverlap )"));
+
+      LOG_STREAM << "Overlap in pixels: " << endl;
+      int ix;
+      for(ix=0; ix<m_NAXIS; ix++)
+      {
+         LOG_STREAM << "PIX " << lbnd[ix] << " .. " << ubnd[ix] << endl;
+         pix_ranges.push_back({lbnd[ix],ubnd[ix]});
+      }
+   }
+
+   return overlap_ranges{ov_code, pix_ranges };
+}
+#else
+void log_bounds(string prefix, int length, double low[], double up[])
+{
+   int ix;
+   LOG_STREAM << prefix << endl;
+   for(ix=0; ix<length; ix++) LOG_STREAM << prefix << low[ix] << " " << up[ix] 
+      << " deg: " << R2D*low[ix] << " " << R2D*up[ix] << endl;
+}
+
+
+
+
+void ast::frameset::set_bounds_to_rect(coordinates& coord)
+{
+   LOG_trace(__func__);
+
+   switch(coord.shape)
+   {
+      case area::CIRCLE:
+      case area::RECT:
+         /* both set already */
+         break;
+      case area::POLYGON:
+         {
+            my_assert(coord.p_lon_deg.size()==coord.p_lat_deg.size(),
+                  __FILE__,__LINE__,"coord::p_lon and p_lat sizes differ");
+
+            int npnt = coord.p_lon_deg.size();
+            int dim  = npnt;
+            double points[2][dim];
+            const double * pts = &(points[0][0]);
+
+            LOG_STREAM << "polygon ";
+            int ii;
+            for(ii = 0; ii<dim; ii++)
+            {
+               points[0][ii] = D2R * coord.p_lon_deg[ii];
+               points[1][ii] = D2R * coord.p_lat_deg[ii];
+
+               LOG_STREAM << "(" << R2D * points[0][ii] << ", " << R2D * points[1][ii] << ")";
+            }
+            LOG_STREAM << endl;
+
+            AstSkyFrame * sky_frm = (AstSkyFrame*)find_skyframe();
+
+            my_assert(sky_frm != nullptr, __FILE__,__LINE__,"sky frame not found in header frameset");
+
+            AstPolygon * astPoly = astPolygon(sky_frm, npnt, dim, pts, NULL, " ");
+            if ( !astOK )
+               throw runtime_error(failed_with_status(__FILE__,__LINE__,"astPolygon( npoints=" + to_string(npnt) + "...)"));
+
+            double lbnd[2];
+            double ubnd[2];
+            astGetRegionBounds( (AstRegion*)astPoly, lbnd, ubnd );
+            if ( !astOK )
+               throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetRegionBounds(astPolygon ...)"));
+            /* NOTE AST-manual:
+             * lbnd < ubnd -> ok, note: if axis has no lower/upper limit largest-negative/largest-positive value is returned
+             * lbnd = ubnd -> axis has one value
+             * lbnd > ubnd -> region has no extent on that axis
+             * lbnd = ubnd = AST__NULL -> bounds on that axis cannot be determined */
+
+            LOG_STREAM << "polygon bounds lon: " << R2D * lbnd[0] << " .. " << R2D * ubnd[0] << endl;
+            LOG_STREAM << "polygon bounds lat: " << R2D * lbnd[1] << " .. " << R2D * ubnd[1] << endl;
+
+            // (mis)use RECT in coord to store bounds:
+            coord.lon_deg =  R2D * (lbnd[0] + ubnd[0])/2.0;
+            coord.lat_deg =  R2D * (lbnd[1] + ubnd[1])/2.0;
+            coord.dlon_deg = R2D * (ubnd[0] - lbnd[0]);
+            coord.dlat_deg = R2D * (ubnd[1] - lbnd[1]);
+         }
+         break;
+
+      default:
+         my_assert(false, __FILE__,__LINE__,"coord::shape invalid");
+   }
+}
+
+
+
+
+/* calcs overlap in pixel coords */
+overlap_ranges ast::frameset::overlap(coordinates coord)
+{
+   LOG_trace(__func__);
+
+   /* ---------- get all header bounds ---------------------- */
+
+   /*   AstRegion * wcsbox = create_header_region();
+    *   NOTE: not used because it creates the header-region based on NAXIS only, not Naxes:
+    *   the FITS header defines the number of axes as the max of (NAXIS, WCSAXES and the highest wcs-index in a wcs-key) */
+
+   AstFrame * wcsfrm = (AstFrame*)astGetFrame( m_hdr_fs, AST__CURRENT );
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetFrame( CURRENT )"));
+
+   AstFrame * pixfrm = (AstFrame*)astGetFrame( m_hdr_fs, AST__BASE );
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetFrame( BASE )"));
+
+   AstMapping * pix2wcs = (AstMapping*)astGetMapping( m_hdr_fs, AST__BASE, AST__CURRENT );
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetMapping( BASE->CURRENT )"));
+
+
+   /* Create AstBox from NAXISn */
+
+   double p1[ AXES_CNT ];
+   double p2[ AXES_CNT ];
+
+   my_assert(m_NAXIS <= AXES_CNT, __FILE__,__LINE__,
+         "This build supports "  + to_string(AXES_CNT) + " axes, but NAXIS is bigger : " + to_string(m_NAXIS));
+
+   int ix;
+   for(ix = 0; ix < m_NAXIS; ix++)
+   {
+      p1[ix] = 0.5;
+      p2[ix] = 0.5 + m_NAXISn[ix];
+   }
+
+   int naxes = astGetI( m_hdr_fs, "Naxes" );
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetI( Naxes )"));
+
+   LOG_STREAM << "Naxes / NAXIS : " << naxes << " / " << m_NAXIS << endl;
+
+   my_assert(naxes <= AXES_CNT, __FILE__,__LINE__,
+         "This build supports "  + to_string(AXES_CNT)
+         + " axes, but Naxes returned from astGetI(header-frameset, Naxes) is bigger : " + to_string(naxes));
+
+   for(ix = m_NAXIS; ix < naxes; ix++)
+   {
+      p1[ix] = 0.5;
+      p2[ix] = 0.5;
+   }
+
+   AstBox * pixbox = astBox( pixfrm, 1, p1, p2, NULL, " " );
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astBox( NAXISn )"));
+
+
+   /* DEBUG only */
+   double xlbnd[AXES_CNT];
+   double xubnd[AXES_CNT];
+   astGetRegionBounds(pixbox, xlbnd, xubnd);
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetRegionBounds( pixbox )"));
+   log_bounds("XX BOUNDS header_pix: ", naxes, xlbnd, xubnd);
+   /* DEBUG only */
+
+
+   /* map header's pixels to WCS-region */
+
+   AstRegion * wcsbox = (AstRegion*)astMapRegion( pixbox, pix2wcs, wcsfrm );
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astMapRegion( pixbox -> wcsbox )"));
+
+
+   /* ------------ get bounds of the header region -------------------------- */
+
+   double low[ AXES_CNT ];
+   double up[ AXES_CNT ];
+
+   astGetRegionBounds(wcsbox, low, up);
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetRegionBounds"));
+
+   log_bounds("XX BOUNDS header: ", naxes, low, up);
+
+
+   /* overwrite bounds for sky-axis and spec axis if given by coord input */
+
+   int ix_lon = astGetI(m_hdr_fs,"LonAxis") - 1;
+   int ix_lat = astGetI(m_hdr_fs,"LatAxis") - 1;
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetI( Lon/LatAxis )"));
+
+   LOG_STREAM << "LonAxis LatAxis indeces (zero-based): " << to_string(ix_lon) << " " << to_string(ix_lat) << endl; 
+
+   set_bounds_to_rect(coord);
+
+   low[ix_lon] = D2R * (coord.lon_deg - coord.dlon_deg/2.0);
+   low[ix_lat] = D2R * (coord.lat_deg - coord.dlat_deg/2.0);
+   up[ix_lon]  = D2R * (coord.lon_deg + coord.dlon_deg/2.0); 
+   up[ix_lat]  = D2R * (coord.lat_deg + coord.dlat_deg/2.0);
+
+   vector<double_xy> pix_ranges;
+
+   if(m_has_specframe && (coord.specsys != specsystem::NONE))
+   {
+      /* FIXME assumes m_frm_fs is VELO and unit is km/s How to align - if set(specsystem not called) ? */
+
+      string unit_key{"Unit("+to_string(m_spec_axis)+")"};
+      const char * cunit = astGetC(m_hdr_fs, unit_key.c_str());
+      string unit(cunit);
+
+      string std_of_rest_key{"StdOfRest("+to_string(m_spec_axis)+")"};
+      const char * cstd_of_rest = astGetC(m_hdr_fs, std_of_rest_key.c_str());
+      string std_of_rest(cstd_of_rest);
+
+      LOG_STREAM << "SpectralAxisUnit:       " << unit << endl;
+      LOG_STREAM << "SpectralAxis StdOfRest: " << std_of_rest << endl;
+
+      /* FIXME  ast-overlap computation breaks if bands do not overlap - suspect AST bug ? */
+      if((( up[m_spec_axis-1] < coord.vl_kmps) && ( up[m_spec_axis-1] < coord.vu_kmps)) ||
+           ((low[m_spec_axis-1] > coord.vl_kmps) && (low[m_spec_axis-1] > coord.vu_kmps)))
+      {
+         /* set values only to get correct debug print for client */
+         low[m_spec_axis-1] = coord.vl_kmps;
+         up [m_spec_axis-1] = coord.vu_kmps;
+         log_bounds("XX BOUNDS client: ", naxes, low, up);
+         LOG_STREAM << "XX BOUNDS no overlap in spectrum axis, returning ov_code=1" << endl;
+         return overlap_ranges{1, pix_ranges };
+      }
+      else // at least partial overlap -> cut coord to min/max of header values
+      {
+         /* FIXME the if() is needed because if coord bounds are bigger than the header's -> overlap yields 1 - suspect bug in AST? */
+         if(low[m_spec_axis-1] < coord.vl_kmps)  low[m_spec_axis-1] = coord.vl_kmps;
+         if(up [m_spec_axis-1] > coord.vu_kmps)  up [m_spec_axis-1] = coord.vu_kmps;
+      }
+   }
+
+   if(m_has_time_axis && (coord.timesys != timesystem::NONE))
+   {
+      // FIXME see the if() in the spec bounds above: test, verify and eventually remove the comments
+      /*if(low[m_time_axis-1] < coord.time_value[0])*/  low[m_time_axis-1] = coord.time_value[0];
+      /*if(up [m_time_axis-1] > coord.time_value[1])*/  up [m_time_axis-1] = coord.time_value[1];
+   }
+
+   if(m_has_stokes_axis && (!coord.pol.empty()))
+   {
+      // FIXME implement properly ranges 1,...4 and -1,...-8 (correspond to ranges I...V and RR...YY respectively)
+      low[m_stokes_axis-1] = min_pol_state(coord.pol);
+      up [m_stokes_axis-1] = max_pol_state(coord.pol);
+   }
+
+   log_bounds("XX BOUNDS client: ", naxes, low, up);
+
+   /* FIXME ignored coord.shape; add circle later : matters for overlap-code (but not for cut, which is always rect) */
+   AstRegion * crgn = (AstRegion *)astBox( wcsbox, 1, low, up , (AstRegion*)AST__NULL," ");
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astBox( coord )"));
+
+   astInvert(pix2wcs);
+   AstCmpRegion * crgn_pix = (AstCmpRegion*)astMapRegion( crgn, pix2wcs, pixfrm );
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astMapRegion( client-region WCS -> PIX by header-mapping )"));
+
+   /* DBG only */
+   double cllow[ AXES_CNT ];
+   double clup[ AXES_CNT ];
+   astGetRegionBounds(crgn_pix, cllow, clup);
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetRegionBounds"));
+
+   log_bounds("XX BOUNDS client_pix: ", naxes, cllow, clup);
+   /* DBG only */
+
+
+   /* overlap */
+
+   /* FIXME returns "5"= exact macth however swapping coords returns seemingly correct overlap code ; bug ?
+    * FIXME TODO workround: if 5 returned -> swap regions and use that value
+    * accept 5 only if returned twice: original oreder and swapped both return 5 */
+   //int ov_code = astOverlap( crgn_pix, pixbox );
+   int ov_code = astOverlap( pixbox, crgn_pix );
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astOverlap( header, caller-coord )"));
+
+   /* if overlap, calc pixel ranges */
+
+
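+   /* astOverlap codes (AST manual): 1 no overlap, 2 first inside second,
+    * 3 second inside first, 4 partial overlap, 5 identical, 6 second is the
+    * exact negation of the first -> 1 and 6 mean no usable overlap here */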
+   bool no_overlap = ((ov_code == 1) || (ov_code == 6));
+   if(!no_overlap)
+   {
+      AstCmpRegion * pixOverlap = astCmpRegion( pixbox, crgn_pix, AST__AND, " ");
+      if ( !astOK || (pixOverlap == AST__NULL) )
+         throw runtime_error(failed_with_status(__FILE__,__LINE__,"astCmpRegion( header AND coord )"));
+
+      double lbnd[AXES_CNT];
+      double ubnd[AXES_CNT];
+      astGetRegionBounds(pixOverlap, lbnd, ubnd);
+      if ( !astOK )
+         throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetRegionBounds( pixOverlap )"));
+
+      log_bounds("XX BOUNDS overlap_pix: ", m_NAXIS, lbnd, ubnd);
+
+      int ix;
+      for(ix=0; ix<m_NAXIS; ix++)
+      {
+         pix_ranges.push_back({lbnd[ix],ubnd[ix]});
+      }
+   }
+
+   return overlap_ranges{ov_code, pix_ranges };
+}
+#endif
+
+
+
+
+bool almost_equal(double x, double y, int ulp)
+{
+   // the machine epsilon has to be scaled to the magnitude of the values used
+   // and multiplied by the desired precision in ULPs (units in the last place)
+   return std::fabs(x-y) <= std::numeric_limits<double>::epsilon() * std::fabs(x+y) * ulp
+      // unless the result is subnormal
+
+      || std::fabs(x-y) < std::numeric_limits<double>::min();
+
+   /* also note std::numeric_limits<double>::epsilon() cannot be used with a value having a magnitude different from 1.0.
+    * Instead, std::nextafter can be used to get the epsilon of a value with any magnitude:
+    *  double nextafter (double x , double y );
+    *  returns the next representable value after x in the direction of y.
+    */
+}
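+
+/* Illustrative: almost_equal(0.1 + 0.2, 0.3, 2) is true although
+ * (0.1 + 0.2 == 0.3) is false in IEEE-754 double arithmetic. */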
+
+/* integer power: x**p */
+int my_pow_int (int x, int p)
+{
+   int i = 1;
+   for (int j = 1; j <= p; j++)  i *= x;
+   return i;
+}
+
+
+std::vector<point2d> ast::frameset::sky_vertices(void)
+{
+   LOG_trace(__func__);
+
+   const int naxes = astGetI(m_hdr_fs, "Naxes");
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetI( Naxes )"));
+
+   my_assert(naxes >= m_NAXIS, __FILE__,__LINE__,"AST-naxes is less than FITS-NAXIS");
+
+   const int ncoord_in  = naxes;
+   const int ncoord_out = naxes; // FIXME for now we assume Nin = Nout
+
+
+   /* generate vertices in form (0.5, NAXISi+0.5) in 'naxes' dimensions */
+
+   /* NOTE 1: lon lat axes can be identified only in WCS coordinates due to
+    * possible permutations and transformations (rotation), so in the pixel-grid
+    * we need to create the complete vertex-array for each dimension */
+
+   /* NOTE 2: this generates an unordered set of vertices.
+    * If we could generate the set of vertices ordered by polar angle (from the center of the 4 vertices)
+    * then polygons could be generated without later re-ordering.
+    * Is it possible at least in specific cases ? */
+
+   const int npoint = my_pow_int(2,m_NAXIS);
+
+   LOG_STREAM << "VERT Nin Nout: " << ncoord_in  << " " << ncoord_out << " npoint: " << npoint << endl;
+
+   double vecs_in [ncoord_in ][npoint];
+   double vecs_out[ncoord_out][npoint];
+
+   const double MIN_VAL = 0.5;
+
+   int ic,ip;
+   for(ip=0; ip < npoint; ip++)
+   {
+      for(ic=0; ic < ncoord_in; ic++)
+      {
+         const int bit = 1 << ic;
+
+         if(ic < m_NAXIS)
+            vecs_in[ic][ip] = (bit & ip) ? MIN_VAL : (MIN_VAL + m_NAXISn[ic]);
+         else
+            vecs_in[ic][ip] = (bit & ip) ? MIN_VAL : (MIN_VAL + 1);
+      }
+   }
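+
+   /* Illustrative: for a 2D image with NAXIS1=100, NAXIS2=50 this yields the
+    * npoint=4 corners (100.5, 50.5), (0.5, 50.5), (100.5, 0.5), (0.5, 0.5):
+    * bit ic of ip selects the low (0.5) or high (NAXISn+0.5) edge on axis ic. */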
+
+   for(ip=0; ip<npoint; ip++)
+   {
+      LOG_STREAM << "Pi[" << ip << "]: ";
+      for(ic=0; ic<ncoord_in; ic++)
+         LOG_STREAM << " " << vecs_in[ic][ip];
+      LOG_STREAM << endl;
+   }
+
+
+   /* transform the generated vertices in pixels to WCS and identify lon, lat axis */
+
+   double * ptr_in [ncoord_in ];
+   double * ptr_out[ncoord_out];
+
+   int ix;
+   for(ix=0; ix<ncoord_in ; ix++) ptr_in [ix] = &(vecs_in [ix][0]);
+   for(ix=0; ix<ncoord_out; ix++) ptr_out[ix] = &(vecs_out[ix][0]);
+
+
+   astTranP(m_hdr_fs, npoint, ncoord_in, (const double**)ptr_in, 1, ncoord_out, ptr_out);
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astTranP(pix -> wcs vertices)"));
+
+
+   const int ixlon = astGetI(m_hdr_fs,"LonAxis") - 1;
+   const int ixlat = astGetI(m_hdr_fs,"LatAxis") - 1;
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetI( Lon/LatAxis )"));
+
+   LOG_STREAM << "VERT ix_lon ix_lat: " << ixlon  << " " << ixlat << endl;
+
+
+   /* select the 4 sky-vertices in case Naxes > 2 */
+
+   double vertlon[npoint];
+   double vertlat[npoint];
+
+   int in = 0;
+   if(npoint <= 4)
+   {
+      for(ip=0; ip<npoint; ip++)
+      {
+         vertlon[ip] = ptr_out[ixlon][ip];
+         vertlat[ip] = ptr_out[ixlat][ip];
+         in++;
+      }
+   }
+   else
+   {
+      vertlon[in] = ptr_out[ixlon][0];
+      vertlat[in] = ptr_out[ixlat][0];
+      in++;
+
+      LOG_STREAM << "Po[0] cnt: <ref>" << endl; 
+
+      /* ip=0 used as reference */
+      for(ip=1; ip<npoint; ip++)
+      {
+         int ic;
+         int cnt = 0;
+         for(ic=0; ic<ncoord_out; ic++)
+         {
+            if( (ic != ixlon) && (ic != ixlat) && (ptr_out[ic][ip] == ptr_out[ic][0]) ) cnt++;
+            /* asking for an exact bit-pattern match on the floating point numbers ptr_out[][] is possible
+             * because the axes are independent and so the same machine-code computation
+             * is performed on them, which must yield the same result down to the last bit */
+         }
+
+         bool coord_match = (cnt == (ncoord_out - 2));
+
+         LOG_STREAM << "Po[" << ip << "] cnt: " << cnt << endl; 
+
+         if(coord_match)
+         {
+            vertlon[in] = ptr_out[ixlon][ip];
+            vertlat[in] = ptr_out[ixlat][ip];
+            in++;
+         }
+#if 0
+         double lon = ptr_out[ixlon][ip];
+         double lat = ptr_out[ixlat][ip];
+
+         int ii;
+         bool found = false;
+         for(ii=0; ii<in; ii++)
+         {
+            int ulp = 2; // precision in units in the last place
+            if( almost_equal(vertlon[ii], lon, ulp) && almost_equal(vertlat[ii], lat, ulp) )
+            {
+               found = true;
+               break;
+            }
+         } 
+
+         if(!found)
+         {
+            vertlon[in] = ptr_out[ixlon][ip];
+            vertlat[in] = ptr_out[ixlat][ip];
+            in++;
+         }
+#endif
+      }
+   }
+
+   my_assert((in==4), __FILE__,__LINE__,"expected 4 vertices, but found " + to_string(in) + " from npoint: " + to_string(npoint));
+
+   vector<point2d> ret_vert;
+
+   for(ix=0; ix<in; ix++)
+   {
+      ret_vert.push_back(point2d{(R2D)*vertlon[ix], (R2D)*vertlat[ix]});
+
+      LOG_STREAM << "VERTx[" << ix << "] (deg): " << (R2D)*vertlon[ix] << " " << (R2D)*vertlat[ix] << endl;
+   }
+
+   return ret_vert;
+}
+
+
+
+
+
+
+void ast::frameset::assert_valid_state(void)
+{
+   LOG_trace(__func__);
+
+   /* check NAXIS */
+
+   my_assert((m_NAXIS >= 0)&&(m_NAXIS <= AXES_CNT), __FILE__,__LINE__, ": NAXIS must be >= 0 and <= " + to_string(AXES_CNT));
+   int ix;
+   for(ix=0; ix<m_NAXIS;ix++)
+      my_assert((m_NAXISn[ix] >= 0), __FILE__,__LINE__, ": NAXIS["+to_string(ix)+"]: " + to_string(m_NAXISn[ix]));
+
+   /* check spec-axis identification */
+
+   if(m_has_specframe)
+      my_assert( (m_spec_axis >= 1)/*&&(m_spec_axis <=m_NAXIS) FIXME should be Naxes of m_hdr_fs */,
+            __FILE__,__LINE__, ": m_spec_axis is " + to_string(m_spec_axis) + " and must be 1 .. " + to_string(m_NAXIS) );
+
+   /* check AstFrameSet */
+
+   my_assert((m_hdr_fs != AST__NULL), __FILE__,__LINE__, ": AstFrameSet is NULL");
+   /* FIXME add check internal to AstFrameSet:
+    * - FrameSet must have at least 2 Frames 
+    * - AST__BASE Frame must have only GRID domain <-- verify this
+    * - AST__CURRENT Frame must have SKY domain 
+    * - if AST__CURRENT Frame has also SpecFrame -> its axis must match m_spec_axis 
+    * - if spectral present and is VELO unit should be km/s FIXME see in overlap  
+    */
+} 
+
+
+
+std::ostream& operator<<(std::ostream& out, const ast::frameset& p)
+{
+   return p.serialize(out);
+}
+
+
+std::ostream& ast::frameset::serialize(std::ostream& ostrm) const
+{
+   ostrm << "ast::frameset: ";
+
+   ostrm << "NAXIS[";
+   int ix;
+   for(ix=0; ix<(m_NAXIS-1); ix++) ostrm << m_NAXISn[ix] << " ";
+   ostrm << m_NAXISn[m_NAXIS-1];
+   ostrm << "]";
+
+   ostrm << " Ast:";
+
+   const int n_frame = astGetI(m_hdr_fs,"Nframe");
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetI(Nframe)"));
+
+   const char * domain = astGetC(m_hdr_fs,"Domain");
+   if ( !astOK )
+      throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetC(Domain)"));
+
+   ostrm << "Domain(" << domain << ") ";
+
+
+   for(ix=0; ix<n_frame; ix++)
+   {
+      AstFrame * fr =  (AstFrame*)astGetFrame(m_hdr_fs, ix+1);
+
+      const char *astclass = astGetC( fr, "Class" );
+      //const char *title = astGetC( fr, "Title" );
+      const char *domain = astGetC( fr, "Domain" );
+      //const char *system = astGetC( fr, "System" );
+      const int naxes = astGetI(fr,"Naxes");
+      if ( !astOK )
+         throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetX()"));
+
+
+      ostrm << " ";
+      ostrm << astclass<< "[" << ix+1 << "]{";
+      ostrm << "Domain(" << domain << ")";
+
+      ostrm << " Axes[";
+      int axis;
+      for(axis=1; axis<=naxes; axis++)
+      {
+         const string domain_key{"Domain(" + to_string(axis) + ")"};
+         const string symbol_key{"Symbol(" + to_string(axis) + ")"};
+         const string system_key{"System(" + to_string(axis) + ")"};
+         const string unit_key{"Unit(" + to_string(axis) + ")"};
+
+         const char * domain  = astGetC(fr, domain_key.c_str());
+         const char * symbol  = astGetC(fr, symbol_key.c_str());
+         const char * system  = astGetC(fr, system_key.c_str());
+         const char * unit    = astGetC(fr, unit_key.c_str());
+         if ( !astOK )
+            throw runtime_error(failed_with_status(__FILE__,__LINE__,"astGetX()"));
+
+         ostrm << domain << " " << symbol << " " << system << " " << unit;
+         if(axis != naxes) ostrm << " | ";
+      }
+      ostrm << "]";
+      ostrm << "}";
+   }
+
+
+   ostrm << " has specframe: " << m_has_specframe;
+   if(m_has_specframe)
+   {
+      ostrm << " at axis (one-based) " << to_string(m_spec_axis) << " ";
+
+      string unit_key{"Unit("+to_string(m_spec_axis)+")"};
+      const char * cunit = astGetC(m_hdr_fs, unit_key.c_str());
+      string unit(cunit);
+      LOG_STREAM << "SpecUnit(" << unit << ") ";
+
+      string std_of_rest_key{"StdOfRest("+to_string(m_spec_axis)+")"};
+      const char * cstd_of_rest = astGetC(m_hdr_fs, std_of_rest_key.c_str());
+      string std_of_rest(cstd_of_rest);
+      LOG_STREAM << "StdOfRest(" << std_of_rest << ")";
+   }
+
+   ostrm << " | ";
+
+   ostrm << "has stokes axis: " << m_has_stokes_axis;
+   if(m_has_stokes_axis)
+   {
+      ostrm << " at axis (one-based) " << to_string(m_stokes_axis) << " ";
+   }
+
+   ostrm << " | ";
+
+   ostrm << "has time axis: " << m_has_time_axis;
+   if(m_has_time_axis)
+   {
+      ostrm << " at axis (one-based) " << to_string(m_time_axis) << " ";
+
+      string system_key{"System("+to_string(m_time_axis)+")"};
+      const char * csystem = astGetC(m_hdr_fs, system_key.c_str());
+      string system(csystem);
+      LOG_STREAM << "System(" << system << ")";
+
+      string unit_key{"Unit("+to_string(m_time_axis)+")"};
+      const char * cunit = astGetC(m_hdr_fs, unit_key.c_str());
+      string unit(cunit);
+      LOG_STREAM << "TimeUnit(" << unit << ") ";
+   }
+
+   ostrm << endl; 
+
+   return ostrm;
+}
+
+void ast::frameset::log_NAXISn(void)
+{
+   LOG_STREAM << "NAXISn[AXES_CNT]: ";
+   int ix;
+   for(ix=0; ix<AXES_CNT;ix++) LOG_STREAM << m_NAXISn[ix] << " ";
+   LOG_STREAM << endl;
+}
+
+
+
+
+void ast::frameset::log_warnings(const AstFitsChan * fchan)
+{
+   LOG_trace(__func__);
+
+   // reports warnings of the last astRead or astWrite invocation
+
+   AstKeyMap *warnings;
+#define KEYLEN (15)
+   char key[ KEYLEN ];
+   short int iwarn;
+   const char *message;
+
+   if( astGetI( fchan, "ReportLevel" ) > 0 )
+   {
+      warnings = (AstKeyMap*)astWarnings( fchan );
+
+      if( warnings && astOK )
+      {
+         LOG_STREAM <<  "The following warnings were issued" << endl;
+
+         iwarn = 1;
+         while( astOK )
+         {
+            snprintf( key,KEYLEN, "Warning_%d", iwarn++ );
+            key[KEYLEN-1]='\0';
+            if( astMapGet0C( warnings, key, &message ) )
+            {
+               LOG_STREAM <<  string{key} << " - " <<  string{message} << endl;
+            }
+            else
+            {
+               break;
+            }
+         }
+      }
+      /* FIXME why clear ? */
+      astClearStatus;
+   }
+}
+
+
+
diff --git a/data-access/engine/src/common/src/ast_frameset.hpp b/data-access/engine/src/common/src/ast_frameset.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..23caabec215bba79f5ee0847d7f4847016635de0
--- /dev/null
+++ b/data-access/engine/src/common/src/ast_frameset.hpp
@@ -0,0 +1,82 @@
+#ifndef AST_FRAMESET_HPP
+#define AST_FRAMESET_HPP
+
+#include <string>
+extern "C" {
+#include <ast.h>
+}
+
+#include "cutout.hpp" // coordinates needed
+#include "ast4vl.hpp" // struct decls needed
+
+#include <array>
+#include <vector>
+#include <ostream>
+#include <string>
+
+
+namespace ast
+{
+
+class frameset
+{
+   public:
+      frameset(std::string header);
+      ~frameset();
+
+      bool has_specframe(void);
+      bool has_timeaxis(void);
+
+      void set(skysystem skysys);
+      void set(specsystem specsys);
+      void set(timesystem timesys);
+
+      void set_skysystem(std::string skysys_str);
+      void set_specsystem(std::string specsys_str);
+      void set_timesystem(std::string timesys_str);
+
+      std::vector<point2d> sky_vertices(void);
+      std::vector<Bounds> bounds(void);
+      overlap_ranges overlap(coordinates coord);
+
+      std::ostream& serialize(std::ostream& strm) const;
+
+   private:
+      void log_warnings(const AstFitsChan * fchan);
+
+      AstRegion * create_header_region(void);
+      void set_spec_axis(void);
+      void set_pol_time_axis(void);
+      void log_NAXISn(void);
+
+      void* find_skyframe(void);
+      void set_bounds_to_rect(coordinates& coord);
+      void assert_valid_state(void);
+
+      /* const*/ int m_NAXIS;
+      /* const */std::vector<int> m_NAXISn;
+      AstFrameSet * m_hdr_fs;
+
+      // FIXME can be part of the state only while no external operation - currently set(skysys), set(specsys) -
+      // causes an axis permutation; otherwise the spec-axis number needs to be set again
+      // NOTE: currently used only in overlap()
+      bool m_has_specframe;
+      int m_spec_axis;
+
+      bool m_has_stokes_axis;
+      int m_stokes_axis;
+
+      bool m_has_time_axis;
+      int m_time_axis;
+};
+
+}
+
+
+
+
+std::ostream& operator<<(std::ostream& out, const ast::frameset& p);
+
+#endif
+
+
diff --git a/data-access/engine/src/common/src/cutout.cpp b/data-access/engine/src/common/src/cutout.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..3651f26f1571b1fb3bc62a4c039f4a1a4b4a94e9
--- /dev/null
+++ b/data-access/engine/src/common/src/cutout.cpp
@@ -0,0 +1,311 @@
+
+#include "cutout.hpp"
+#include "fitsfiles.hpp"
+#include "fits_header.hpp"
+#include "ast4vl.hpp"
+#include "cutout_nljson.hpp"
+#include "cutout_ostream.hpp"
+#include "json.hpp"
+#include "io.hpp"
+#include "my_assert.hpp"
+
+#include <string.h>
+#include <string>
+#include <fstream> // ofstream for tgz-file
+#include <sstream>
+#include <stdexcept>
+// for timestamp
+#include <iomanip>
+#include <chrono>
+#include <ctime>
+
+/* create_timestamp */
+#include <time.h>
+#include <sys/time.h>
+
+
+
+using namespace std;
+using json = nlohmann::json;
+
+coordinates::coordinates()
+{
+   skysys  = skysystem::NONE;
+   shape   = area::CIRCLE;
+   specsys = specsystem::NONE;
+   timesys = timesystem::NONE;
+}
+
+
+int convert_pol(std::string pol)
+{
+   // "I", "Q", "U", "V", "RR", "LL", "RL", "LR", "XX", "YY", "XY", "YX"
+        if(pol.compare("I") == 0) return 1;
+   else if(pol.compare("Q") == 0) return 2;
+   else if(pol.compare("U") == 0) return 3;
+   else if(pol.compare("V") == 0) return 4;
+
+   else if(pol.compare("RR") == 0) return -1;
+   else if(pol.compare("LL") == 0) return -2;
+   else if(pol.compare("RL") == 0) return -3;
+   else if(pol.compare("LR") == 0) return -4;
+   else if(pol.compare("XX") == 0) return -5;
+   else if(pol.compare("YY") == 0) return -6;
+   else if(pol.compare("XY") == 0) return -7;
+   else if(pol.compare("YX") == 0) return -8;
+   else
+      throw invalid_argument(pol + " is not a valid polarization state");
+}
+
+int min_pol_state(std::vector<std::string> pol)
+{
+   int min_p = 1000;
+   for(string p : pol)
+   {
+      int np = convert_pol(p);
+      if(np < min_p) min_p = np;
+   }
+   return min_p;
+}
+
+int max_pol_state(std::vector<std::string> pol)
+{
+   int max_p = -1000;
+   for(string p : pol)
+   {
+      int np = convert_pol(p);
+      if(np > max_p) max_p = np;
+   }
+   return max_p;
+}
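+// Illustrative example: for pol = {"I", "V"}, min_pol_state() and max_pol_state()
+// return 1 and 4, i.e. the smallest and largest Stokes codes among the requested states.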
+
+
+
+
+char * usec_timestamp(char * ts, size_t ts_len)
+{
+	struct timeval tv; 
+	time_t nowtime;
+	struct tm *nowtm;
+	char tmbuf[64];//, buf[64];
+
+	gettimeofday(&tv, NULL);
+	nowtime = tv.tv_sec;
+	nowtm = localtime(&nowtime);
+	strftime(tmbuf, sizeof tmbuf, "%Y-%m-%d_%H-%M-%S", nowtm);
+	snprintf(ts, ts_len, "%s_%06ld", tmbuf, (long)tv.tv_usec);
+
+	return ts; 
+}   
+
+string create_timestamp()
+{
+	const int TS_LEN = 256;
+	char ts[TS_LEN];
+	return string(usec_timestamp(ts,TS_LEN));
+}
+
+string generate_cut_fitsname(string pubdid, unsigned int hdunum)
+{
+
+   //pubdid = pubdid.substr(0,pubdid.find_last_of('_'));
+
+   string cutfitsname{"vlkb-cutout"};
+
+   string timestamp{create_timestamp()};
+   replace(pubdid.begin(), pubdid.end(), '/', '-');
+   replace(pubdid.begin(), pubdid.end(), ' ', '_');
+
+   // FIXME works only for one digit hdunum
+
+   if(hdunum == 1)
+   {
+      return cutfitsname + '_' + timestamp + '_' + pubdid;
+   }
+   else
+   {
+      string extn{"EXT" + to_string(hdunum-1)};
+      return cutfitsname + '_' + timestamp + '_' +  extn + '_' + pubdid;
+   }
+}
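+// Illustrative example (the timestamp varies): generate_cut_fitsname("lane/cube 1.fits", 2)
+// yields something like "vlkb-cutout_2024-01-31_12-00-00_000042_EXT1_lane-cube_1.fits".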
+
+
+
+
+int to_v_type(enum specsystem specsys)
+{
+	switch(specsys)
+	{
+		case specsystem::NONE: return 0;
+		case specsystem::VELO_LSRK: return 1;
+		case specsystem::WAVE_Barycentric: return 2;
+	}
+	return 0;
+}
+
+
+
+int to_v_valid(enum specsystem specsys)
+{
+	return specsys == specsystem::NONE ? 0 : 1;
+}
+
+
+
+string to_cfitsio_format(vector<uint_bounds> bounds)
+{
+	my_assert(!bounds.empty(),__FILE__,__LINE__,"bounds vector is empty" );
+
+	stringstream ss;
+	ss << "[";
+	ss << bounds[0].pix1 << ":" << bounds[0].pix2;
+	for(unsigned int i = 1; i < bounds.size(); i++)
+	{
+		ss << " " << bounds[i].pix1 << ":" << bounds[i].pix2;
+	}
+	ss << "]";
+	return ss.str();
+}
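+// Illustrative example: bounds {{1,100},{1,50},{10,20}} formats as "[1:100 1:50 10:20]",
+// the section suffix appended to the pathname passed to fits_hdu_cut() in cutout_file() below.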
+
+coordinates to_coordinates(const position pos, const band bnd, const time_axis time, const std::vector<std::string> pol)
+{
+coordinates coord;
+   coord.skysys = pos.sys;
+
+   if(pos.shape == area::RANGE)
+      coord.shape = area::RECT;
+   else
+      coord.shape = pos.shape;
+
+   switch(coord.shape)
+   {
+      case area::CIRCLE:
+         {
+            coord.lon_deg  = pos.circ.lon;
+            coord.lat_deg  = pos.circ.lat;
+            coord.dlon_deg = 2.0 * pos.circ.radius;
+            coord.dlat_deg = coord.dlon_deg;
+         }
+         break;
+
+      case area::RECT:
+         {
+            coord.lon_deg  = (pos.rng.lon1 + pos.rng.lon2)/2.0;
+            coord.lat_deg  = (pos.rng.lat1 + pos.rng.lat2)/2.0;
+            coord.dlon_deg = pos.rng.lon2 - pos.rng.lon1;
+            coord.dlat_deg = pos.rng.lat2 - pos.rng.lat1;
+         }
+         break;
+
+      case area::POLYGON:
+         {
+            coord.p_lon_deg  = pos.poly.lon;
+            coord.p_lat_deg  = pos.poly.lat;
+         }
+         break;
+
+      default:
+         {
+            my_assert( false, __FILE__,__LINE__, " unknown shape in position: " + to_string(int(coord.shape)) );
+         }
+   }
+
+   coord.specsys = bnd.sys;
+   if(coord.specsys != specsystem::NONE)
+   {
+      coord.vl_kmps = bnd.band_value[0];
+      coord.vu_kmps = bnd.band_value[1];
+   }
+
+   coord.timesys = time.sys;
+   if(coord.timesys != timesystem::NONE)
+   {
+      coord.time_value[0] = time.time_value[0];
+      coord.time_value[1] = time.time_value[1];
+   }
+
+   coord.pol = pol;
+
+   return coord;
+}
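+// Note: for CIRCLE the coordinates carry the diameter, dlon_deg = 2 * radius, so an
+// illustrative circle of radius 0.5 deg gives dlon_deg = dlat_deg = 1.0 deg.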
+
+
+
+
+
+
+std::uintmax_t cutout_file(
+      const string abs_fits_pathname, unsigned int hdunum,
+      const coordinates coord,
+      const string vlkbcutout_pathname,
+      const vector<fits_card> extra_cards)
+{
+   LOG_trace(__func__);
+
+   string header{ fitsfiles::read_header(abs_fits_pathname, hdunum) };
+
+   my_assert(!header.empty(),__FILE__,__LINE__,"header is empty" );
+
+   if(!extra_cards.empty())
+   {
+      header = fitsfiles::append_card_if_not_in_header(header, extra_cards);
+   }
+
+   int ov_code;
+   vector<uint_bounds> bounds = calc_overlap(header, coord, ov_code);
+   //vector<uint_bounds> bounds = legacy::call_AST4VL_overlap(coord, header, ov_code);
+
+   if((ov_code==1)||(ov_code==6))
+      throw invalid_argument("given coordinates do not overlap with given fits-hdu area (ov_code = " + to_string(ov_code) +")");
+
+   string bounds_str = to_cfitsio_format(bounds);
+
+   LOG_STREAM << "bounds " << bounds_str << endl;
+
+   fitsfiles::fits_hdu_cut(abs_fits_pathname + bounds_str, hdunum, vlkbcutout_pathname);
+
+   if(!extra_cards.empty())
+   {
+      fits::header hdr(vlkbcutout_pathname, 1, READWRITE);
+      hdr.update(extra_cards);
+   }
+
+   LOG_STREAM << "cutout file: " + vlkbcutout_pathname << endl;
+
+   return fitsfiles::fileSize(vlkbcutout_pathname);
+}
+
+
+
+cutout_res_s do_cutout_file(
+      const std::string fits_pathname, unsigned int hdunum,
+      const position pos, const band bnd, const time_axis time, const std::vector<std::string> pol,
+      const bool count_null_values,
+      const std::vector<fits_card> extra_cards,
+      const std::string conf_fits_path,
+      const std::string conf_fits_cutpath)
+{
+   const string abs_subimg_pathname = conf_fits_cutpath + "/" + generate_cut_fitsname(fits_pathname, hdunum);
+   const string abs_fits_pathname{ conf_fits_path + "/" + fits_pathname };
+
+   coordinates coord = to_coordinates(pos, bnd, time, pol);
+
+   uintmax_t filesize = cutout_file(abs_fits_pathname, hdunum, coord, abs_subimg_pathname, extra_cards);
+
+   unsigned long long null_cnt  = 0;
+   unsigned long long total_cnt = 0;
+   double fill_ratio = -1.0;
+   if(count_null_values)
+   {
+      fill_ratio = fitsfiles::calc_nullvals(abs_subimg_pathname, /*hdunum*/1, null_cnt, total_cnt);
+   }
+
+   cutout_res_s cutres{ filesize, abs_subimg_pathname, {fill_ratio, null_cnt, total_cnt} };
+   return cutres;
+}
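+// Minimal usage sketch (hypothetical paths and values, assuming pos/bnd/tim were
+// filled by the JSON layer):
+//
+//    cutout_res_s res = do_cutout_file("survey/cube.fits", 1, pos, bnd, tim, {},
+//                                      true, {}, "/srv/fits", "/srv/cutouts");
+//    // res.filesize, res.filename, res.nullvals_count describe the written cutout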
+
+
+
+
+
+
diff --git a/data-access/engine/src/common/src/cutout_nljson.cpp b/data-access/engine/src/common/src/cutout_nljson.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..8c7cc8ad7fd5837f897863283045f613c616565c
--- /dev/null
+++ b/data-access/engine/src/common/src/cutout_nljson.cpp
@@ -0,0 +1,279 @@
+
+#include "json.hpp"
+#include "io.hpp"
+#include "my_assert.hpp"
+#include "cutout.hpp"
+#include "cutout_nljson.hpp"
+
+#include <string>
+using namespace std;
+
+using json = nlohmann::json;
+
+
+NLOHMANN_JSON_SERIALIZE_ENUM( area, {
+      {area::CIRCLE, "CIRCLE"},
+      {area::RECT, "RECT"},
+      {area::RECT, "RANGE"},
+      {area::POLYGON, "POLYGON"}
+      });
+
+NLOHMANN_JSON_SERIALIZE_ENUM( skysystem, {
+      {skysystem::NONE, "NONE"},
+      {skysystem::GALACTIC, "GALACTIC"},
+      {skysystem::ICRS, "ICRS"}});
+
+NLOHMANN_JSON_SERIALIZE_ENUM( specsystem, {
+      {specsystem::NONE, "NONE"},
+      {specsystem::VELO_LSRK, "VELO_LSRK"}, // Local Standard of Rest, Kinematic
+      {specsystem::WAVE_Barycentric, "WAVE_Barycentric"},
+      });
+
+
+
+
+
+
+void from_json(const json& j, circle& p)
+{
+   j.at("lon").get_to(p.lon);
+   j.at("lat").get_to(p.lat);
+   j.at("radius").get_to(p.radius);
+}
+void from_json(const json& j, range& p)
+{
+   j.at("lon1").get_to(p.lon1);
+   j.at("lon2").get_to(p.lon2);
+   j.at("lat1").get_to(p.lat1);
+   j.at("lat2").get_to(p.lat2);
+}
+void from_json(const json& j, polygon& p)
+{
+   j.at("lon").get_to(p.lon);
+   j.at("lat").get_to(p.lat);
+}
+
+void from_json(const json& j, position& p)
+{
+   j.at("system").get_to(p.sys);
+
+   if(j.contains("circle"))
+   {
+      j.at("circle").get_to(p.circ);
+      p.shape = area::CIRCLE;
+   }
+   if(j.contains("range"))
+   {
+      j.at("range").get_to(p.rng);
+      p.shape = area::RANGE;
+   }
+   if(j.contains("polygon"))
+   {
+      j.at("polygon").get_to(p.poly);
+      p.shape = area::POLYGON;
+   }
+}
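+// Illustrative JSON accepted above: {"system":"GALACTIC","circle":{"lon":10.0,"lat":-0.5,"radius":0.1}};
+// a "range" or "polygon" member selects area::RANGE or area::POLYGON instead.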
+
+
+void from_json(const json& j, band& p)
+{
+   j.at("system").get_to(p.sys);
+   j.at("interval").get_to(p.band_value);
+}
+
+void from_json(const json& j, time_axis& p)
+{
+   j.at("system").get_to(p.sys);
+   j.at("interval").get_to(p.time_value);
+}
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+void to_json(json& j, const fits_card& p)
+{
+   j = json{
+      {"key", p.key},
+         {"value", p.value},
+         {"comment", p.comment}
+   };
+}
+
+void from_json(const json& j, fits_card& p)
+{
+   string empty_str;
+   if(j.contains("key")) j.at("key").get_to(p.key); else p.key = empty_str;
+   if(j.contains("value")) j.at("value").get_to(p.value); else p.value = empty_str;
+   if(j.contains("comment")) j.at("comment").get_to(p.comment); else p.comment = empty_str;
+}
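+// Illustrative JSON: {"key":"ORIGIN","value":"'VLKB'","comment":"added by cutout"};
+// members missing from the JSON default to empty strings.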
+
+
+
+
+// cutout
+
+void to_json(json& j, const nullvals_count_s& p)
+{
+   j = json{
+      {"fillratio", p.fill_ratio},
+         {"nullcount", p.null_count},
+         {"totalcount", p.total_count}
+   };
+}
+
+
+void to_json(json& j, const cutout_res_s& p)
+{
+   j = json{
+      {"filesize", p.filesize},
+         {"filename", p.filename},
+         {"nullvals_count", p.nullvals_count}
+   };
+}
+
+
+
+
+
+
+void to_json(json& j, const coordinates& p)
+{
+   switch(p.shape)
+   {
+      case area::RANGE:
+         my_assert(false, __FILE__, __LINE__, "area::RANGE not valid for struct coordinates");
+         break;
+      case area::POLYGON:
+         my_assert(false, __FILE__, __LINE__, "area::POLYGON not valid for struct coordinates");
+         break;
+
+      case area::CIRCLE:
+
+         // the doubles must be bit-equal here: dlat was assigned from dlon (dlat := dlon), not computed
+         my_assert( (p.dlon_deg == p.dlat_deg) ,
+               __FILE__,__LINE__,"coordinate dlon and dlat must be equal when shape is CIRCLE, however "
+               + to_string(p.dlon_deg) +" vs " + to_string(p.dlat_deg) );
+
+         if(p.specsys == specsystem::NONE)
+         {
+            j = json{
+               {"skysystem", p.skysys},
+                  {"l",p.lon_deg},
+                  {"b",p.lat_deg},
+                  {"shape",p.shape},
+                  {"r",p.dlon_deg}
+            };
+         }
+         else
+         {
+            j = json{
+               {"skysystem", p.skysys},
+                  {"l",p.lon_deg},
+                  {"b",p.lat_deg},
+                  {"shape",p.shape},
+                  {"r",p.dlon_deg},
+                  {"specsystem",p.specsys},
+                  {"vl",p.vl_kmps},
+                  {"vu",p.vu_kmps}
+            };
+         }
+
+         break;
+
+      case area::RECT: 
+         if(p.specsys == specsystem::NONE)
+         {
+            j = json{
+               {"skysystem", p.skysys},
+                  {"l",p.lon_deg},
+                  {"b",p.lat_deg},
+                  {"shape",p.shape},
+                  {"dl",p.dlon_deg},
+                  {"db",p.dlat_deg}
+            };
+         }
+         else
+         {
+            j = json{
+               {"skysystem", p.skysys},
+                  {"l",p.lon_deg},
+                  {"b",p.lat_deg},
+                  {"shape",p.shape},
+                  {"dl",p.dlon_deg},
+                  {"db",p.dlat_deg},
+                  {"specsystem",p.specsys},
+                  {"vl",p.vl_kmps},
+                  {"vu",p.vu_kmps}
+            };
+         }
+         break;
+   }
+}
+
+
+void from_json(const json& j, coordinates& p)
+{
+   if(j.contains("skysystem"))
+      j.at("skysystem").get_to(p.skysys);
+   else
+      p.skysys = skysystem::GALACTIC;
+
+
+   j.at("l").get_to(p.lon_deg);
+   j.at("b").get_to(p.lat_deg);
+
+   string shape_str;
+
+   j.at("shape").get_to(shape_str);
+
+   if(shape_str.compare("CIRCLE") == 0)
+   {
+      j.at("r").get_to(p.dlon_deg);
+      p.dlon_deg = 2.0 * p.dlon_deg;
+      p.dlat_deg = p.dlon_deg;
+      p.shape = area::CIRCLE;
+   }
+   else if(shape_str.compare("RECT") == 0)
+   {
+      j.at("dl").get_to(p.dlon_deg);
+      j.at("db").get_to(p.dlat_deg);
+      p.shape = area::RECT;
+   }
+   else if(shape_str.compare("POLYGON") == 0)
+   {
+      j.at("lon").get_to(p.p_lon_deg);
+      j.at("lat").get_to(p.p_lat_deg);
+      p.shape = area::POLYGON;
+   }
+   else
+   {
+      my_assert( false, __FILE__,__LINE__, " unknown shape in JSON: " + shape_str );
+   }
+
+   if(j.contains("vl") || j.contains("vu"))
+   {
+      j.at("vl").get_to(p.vl_kmps);
+      j.at("vu").get_to(p.vu_kmps);
+
+      if(j.contains("specsystem"))
+         j.at("specsystem").get_to(p.specsys);
+      else
+         p.specsys = specsystem::VELO_LSRK;
+   }
+   else
+   {
+      p.specsys = specsystem::NONE;
+   }
+}
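+// Illustrative JSON: {"skysystem":"GALACTIC","shape":"CIRCLE","l":10.0,"b":-0.5,"r":0.1,
+// "vl":-50.0,"vu":50.0} parses as a CIRCLE ("r" is doubled into a diameter) and, with
+// "specsystem" absent, the spectral system defaults to VELO_LSRK.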
+
diff --git a/data-access/engine/src/common/src/cutout_ostream.cpp b/data-access/engine/src/common/src/cutout_ostream.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..3499df4329fd3a74d1361eb7815e39602fb37556
--- /dev/null
+++ b/data-access/engine/src/common/src/cutout_ostream.cpp
@@ -0,0 +1,126 @@
+
+#include "cutout_ostream.hpp"
+#include "cutout.hpp"
+#include "mcutout.hpp"
+#include "my_assert.hpp"
+
+#include <iomanip> // setw
+#include <iostream>
+#include <vector>
+
+using namespace std;
+
+
+std::string to_string(skysystem ss)
+{
+   string str;
+   switch(ss)
+   {
+      case skysystem::GALACTIC: str = "GALACTIC"; break;
+      case skysystem::ICRS: str = "ICRS"; break;
+      case skysystem::NONE: str = "NONE"; break;
+   }
+   my_assert(!str.empty(), __FILE__,__LINE__, "unrecognized value of skysystem type");
+   return str;
+}
+
+std::string to_string(area ss)
+{
+   string str;
+   switch(ss)
+   {
+      case area::CIRCLE: str = "CIRCLE"; break;
+      case area::RECT: str = "RECT"; break;
+      case area::RANGE: str = "RANGE"; break;
+      case area::POLYGON: str = "POLYGON"; break;
+   }
+   my_assert(!str.empty(), __FILE__,__LINE__, "unrecognized value of sky area type");
+   return str;
+}
+
+std::string to_string(specsystem ss)
+{
+   string str;
+   switch(ss)
+   {
+      case specsystem::NONE: str = "NONE"; break;
+      case specsystem::VELO_LSRK: str = "VELO_LSRK"; break;
+      case specsystem::WAVE_Barycentric: str = "WAVE_Barycentric"; break;
+   }
+   my_assert(!str.empty(), __FILE__,__LINE__, "unrecognized value of specsystem type");
+   return str;
+}
+
+std::string to_string(timesystem ss)
+{
+   string str;
+   switch(ss)
+   {
+      case timesystem::NONE: str = "NONE"; break;
+      case timesystem::MJD_UTC: str = "MJD_UTC"; break;
+   }
+   my_assert(!str.empty(), __FILE__,__LINE__, "unrecognized value of timesystem type");
+   return str;
+}
+
+
+std::ostream& operator<<( std::ostream &out, struct position const& p)
+{
+   string shape_ostring;
+
+   switch(p.shape)
+   {
+      case area::CIRCLE:
+         shape_ostring = to_string(p.circ.lon) + ", " + to_string(p.circ.lat) + "; " + to_string(p.circ.radius);
+         break;
+      case area::RANGE:
+         shape_ostring = to_string(p.rng.lon1) + " .. " + to_string(p.rng.lon2) + "; "
+            + to_string(p.rng.lat1) + " .. "+ to_string(p.rng.lat2);
+         break;
+      case area::RECT:
+         my_assert(false, __FILE__,__LINE__, "area::RECT is not valid for struct position");
+         break;
+      case area::POLYGON:
+         {
+            std::string poly_lon_str = "";
+            for(double dbl : p.poly.lon) { poly_lon_str += " " + to_string(dbl); }
+            std::string poly_lat_str = "";
+            for(double dbl : p.poly.lat) { poly_lat_str += " " + to_string(dbl); }
+            shape_ostring = " lon(" + poly_lon_str + ") lat(" + poly_lat_str +")";
+         }
+         break;
+   }
+
+   out << to_string(p.shape) << " " << shape_ostring;
+
+   return out;
+}
+
+std::ostream& operator<<( std::ostream &out, struct coordinates const& p)
+{
+
+   std::string pol_str = "";
+   for(string str : p.pol) { pol_str += " " + str; }
+
+   std::string poly_lon_str = "";
+   for(double dbl : p.p_lon_deg) { poly_lon_str += " " + to_string(dbl); }
+   std::string poly_lat_str = "";
+   for(double dbl : p.p_lat_deg) { poly_lat_str += " " + to_string(dbl); }
+
+   bool is_poly = (p.shape == area::POLYGON);
+
+
+   out << to_string(p.skysys) << " : (" << p.lon_deg << ", " << p.lat_deg << ") "
+      << to_string(p.shape) <<  " : ("
+      << (is_poly ? poly_lon_str : to_string(p.dlon_deg)) << ", "
+      << (is_poly ? poly_lat_str : to_string(p.dlat_deg)) << ") "
+      << to_string(p.specsys) << " [km/s] : (" << p.vl_kmps << ", " << p.vu_kmps << ") "
+//      << "POS(" << p.pos << ") "
+//      << "BAND(" << to_string(p.bandsys) << "; " << p.band_value[0]  << ", " << p.band_value[1] << ") "
+      << "TIME(" << to_string(p.timesys) << "; " << p.time_value[0]  << ", " << p.time_value[1] << ") "
+      << "POL(" << pol_str << ")";
+   return out;
+}
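+// Illustrative output line (values are examples only):
+//   GALACTIC : (10, -0.5) CIRCLE : (1.000000, 1.000000) VELO_LSRK [km/s] : (-50, 50) TIME(NONE; 0, 0) POL( I)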
+
+
+
diff --git a/data-access/engine/src/common/src/fits_header.cpp b/data-access/engine/src/common/src/fits_header.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..61e601ffb04250d6cabba4b19c503daf81f6f181
--- /dev/null
+++ b/data-access/engine/src/common/src/fits_header.cpp
@@ -0,0 +1,684 @@
+/* example of how to get the filename from fptr:
+   int fstatus = 0;
+   char filename[FLEN_FILENAME];
+   fits_file_name(fptr, filename, &fstatus);*/
+
+#include "cutout.hpp" // struct fits_card needed
+#include "fits_header.hpp"
+#include "fix_header.hpp"
+
+#include <stdlib.h> // malloc
+#include <string.h> // strcpy
+#include <fitsio.h>
+#include <stdexcept>
+
+using namespace std;
+
+
+string fits::header::cfitsio_errmsg(const char * filename, int line_num, int status)
+{
+   char errmsg[32]; // cfitsio doc says errmsg is "30 chars max"
+   errmsg[0] = 0;
+   fits_get_errstatus(status, errmsg);
+   string msg{"ERR["+to_string(status)+"] " + string{filename} + "." + to_string(line_num) + ":" + string{errmsg}};
+
+   return msg;
+}
+
+string fits::header::cfitsio_errmsg(int status)
+{
+   char errmsg[32]; // cfitsio doc says errmsg is "30 chars max"
+   errmsg[0] = 0;
+   fits_get_errstatus(status, errmsg);
+   string msg{"ERR["+to_string(status)+"] " + string{errmsg}};
+
+   return msg;
+}
+
+
+////////////////////////////////////////////////////////////////
+
+
+void fits::header::parseStrCards(std::map<std::string, std::string>& strCards)
+{
+   LOG_trace(__func__);
+
+   map<string,string>::iterator it = strCards.begin();
+   while(it != strCards.end())
+   {
+      string key = it->first;
+
+      char keyvalue[FLEN_VALUE];
+      int status = 0;
+      int rc = 0;
+      if((rc = fits_read_key_str(fptr, key.c_str(), keyvalue, NULL, &status)))
+      {
+         if((rc == KEY_NO_EXIST) || (rc == VALUE_UNDEFINED))
+         {
+            status = 0; // reset error, we handle this one
+            /*/ LOG_STREAM << "KEYS erase : " << key << " KEY_NO_EXIST or VALUE_UNDEFINED" << endl;*/
+            strCards.erase(key);
+         }
+         else
+         {
+            LOG_STREAM << cfitsio_errmsg(__FILE__,__LINE__,status) << endl; // FIXME exception ?
+         }
+      }
+      else
+      {
+         strCards[key] = keyvalue;// LOG_STREAM << "KEYS found : " << key << " : " << strCards[key] << endl;
+      }
+
+      it++;
+   }
+}
+
+void fits::header::parseUIntCards(std::map<std::string, unsigned long>& uintCards)
+{
+   LOG_trace(__func__);
+
+   map<string, unsigned long>::iterator it = uintCards.begin();
+   while(it != uintCards.end())
+   {
+      string key = it->first;
+
+      unsigned long keyvalue;
+      int status = 0;
+      int rc = 0;
+      if((rc = fits_read_key_lng(fptr, key.c_str(), (long int*)&keyvalue, NULL, &status)))
+      {
+         if((rc == KEY_NO_EXIST) || (rc == VALUE_UNDEFINED))
+         {  
+            status = 0; // reset error, we handle this one
+            /*/ LOG_STREAM << "KEYS erase : " << key << " KEY_NO_EXIST or VALUE_UNDEFINED" << endl;*/
+            uintCards.erase(key);
+         }
+         else
+         {  
+            LOG_STREAM << cfitsio_errmsg(__FILE__,__LINE__,status) << endl;// FIXME exception ?
+         }
+      }
+      else
+      {  
+         uintCards[key] = keyvalue;// LOG_STREAM << "KEYS found : " << key << " : " << uintCards[key] << endl;
+      }
+      it++;
+   }
+}
+
+void fits::header::parseDoubleCards(std::map<std::string, double>& doubleCards)
+{
+   LOG_trace(__func__);
+
+   map<string, double>::iterator it = doubleCards.begin();
+   while(it != doubleCards.end())
+   {
+      string key = it->first;
+
+      double keyvalue;
+      int status = 0;
+      int rc = 0;
+      if((rc = fits_read_key_dbl(fptr, key.c_str(), &keyvalue, NULL, &status)))
+      {
+         if((rc == KEY_NO_EXIST) || (rc == VALUE_UNDEFINED))
+         {
+            status = 0; // reset error, we handle this one
+            /*/ LOG_STREAM << "KEYS erase : " << key << " KEY_NO_EXIST or VALUE_UNDEFINED" << endl;*/
+            doubleCards.erase(key);
+         }
+         else
+         {
+            LOG_STREAM << cfitsio_errmsg(__FILE__,__LINE__,status) << endl;            // FIXME exception?
+         }
+      }
+      else
+      {
+         doubleCards[key] = keyvalue;         // LOG_STREAM << "KEYS found : " << key << " : " << doubleCards[key] << endl;
+      }
+      it++;
+   }
+}
+
+////////////////////////////////
+
+std::map<std::string, std::string> fits::header::parse_string_cards(std::set<std::string> str_keys)
+{
+   map<std::string, std::string> key_value_map;
+
+   set<string>::iterator it = str_keys.begin();
+   while(it != str_keys.end())
+   {
+      string key = *it;
+
+      char keyvalue[FLEN_VALUE];
+      int status = 0;
+      int rc = 0;
+      if((rc = fits_read_key_str(fptr, key.c_str(), keyvalue, NULL, &status)))
+      {
+         if((rc == KEY_NO_EXIST) || (rc == VALUE_UNDEFINED))
+         {
+            status = 0; // reset expected error, we handle this one
+         }
+         else
+         {
+            throw runtime_error("error reading card: " + key); 
+         }
+      }
+      else
+      {
+         key_value_map.insert(pair<string, string>(key, keyvalue));
+      }
+
+      it++;
+   }
+
+   return key_value_map;
+}
+
+std::map<std::string, unsigned long int> fits::header::parse_uint_cards(std::set<std::string> str_keys)
+{
+   map<std::string, unsigned long int> key_value_map;
+
+   set<string>::iterator it = str_keys.begin();
+   while(it != str_keys.end())
+   {
+      string key = *it;
+
+      long int keyvalue;
+      int status = 0;
+      int rc = 0;
+      if((rc = fits_read_key_lng(fptr, key.c_str(), (long int*)&keyvalue, NULL, &status)))
+      {
+         if((rc == KEY_NO_EXIST) || (rc == VALUE_UNDEFINED))
+         {
+            status = 0; // reset expected error, we handle this one
+         }
+         else
+         {
+            throw runtime_error("error reading card: " + key); 
+         }
+      }
+      else
+      {
+         if(keyvalue < 0) throw invalid_argument("card value must not be negative: " + key + " " + to_string(keyvalue));
+         unsigned long int ulong_keyvalue = (unsigned long int)keyvalue;
+         key_value_map.insert(pair<string, unsigned long int>(key, ulong_keyvalue));
+      }
+
+      it++;
+   }
+
+   return key_value_map;
+}
+
+std::map<std::string, double> fits::header::parse_double_cards(std::set<std::string> str_keys)
+{
+   map<std::string, double> key_value_map;
+
+   set<string>::iterator it = str_keys.begin();
+   while(it != str_keys.end())
+   {
+      string key = *it;
+
+      double keyvalue;
+      int status = 0;
+      int rc = 0;
+      if((rc = fits_read_key_dbl(fptr, key.c_str(), &keyvalue, NULL, &status)))
+      {
+         if((rc == KEY_NO_EXIST) || (rc == VALUE_UNDEFINED))
+         {
+            status = 0; // reset expected error, we handle this one
+         }
+         else
+         {
+            throw runtime_error("error reading card: " + key); 
+         }
+      }
+      else
+      {
+         key_value_map.insert(pair<string, double>(key, keyvalue));
+      }
+
+      it++;
+   }
+
+   return key_value_map;
+}
+
+
+fitsfiles::key_values_by_type fits::header::parse_cards(fitsfiles::keys_by_type keys)
+{
+   return fitsfiles::key_values_by_type{
+      parse_string_cards(keys.strKeys),
+         parse_uint_cards(keys.uintKeys),
+         parse_double_cards(keys.doubleKeys)};
+}
+
+/////////////////////////////////////////////////////////////
+
+bool fits::header::contains_card(std::string keyname)
+{
+   int status = 0;
+   char card[FLEN_CARD];
+   bool found = false;
+
+   fits_read_card(fptr, keyname.c_str(), card, &status);
+   switch(status)
+   {
+      case 0:            found = true; break;
+      case KEY_NO_EXIST: found = false; break;
+      default: throw runtime_error(cfitsio_errmsg(__FILE__,__LINE__,status) + " key: " + keyname);
+   }
+
+   return found;
+}
+
+
+string fits::header::read_card(std::string keyname)
+{
+   int status = 0;
+   char card[FLEN_CARD];
+
+   fits_read_card(fptr, keyname.c_str(), card, &status);
+
+   switch(status)
+   {
+      case 0:
+         ; break;
+      case KEY_NO_EXIST:
+         throw invalid_argument("keyname '" + keyname + "' not found in the given FITS-file's HDU[" + to_string(hdunum) + "]");
+      default: 
+         throw runtime_error(cfitsio_errmsg(__FILE__,__LINE__,status) + " keyname: " + keyname);
+   }
+
+   return string{card};
+}
+
+
+int fits::header::get_nkeys()
+{
+   int status = 0;
+   int nkeys;
+   if(!fits_get_hdrspace(fptr, &nkeys, NULL, &status))
+   {
+      return nkeys;
+   }
+   else
+   {
+      throw runtime_error(cfitsio_errmsg(__FILE__,__LINE__,status)); 
+   }
+}
+
+
+// opens for write pathname[hdunum]
+// update() and fix() operate on this opened file
+void fits::header::open_fitsfile(std::string pathname, unsigned long hdunum, int iomode)
+{
+   LOG_trace(__func__);
+
+   int status = 0;
+   string header;
+
+   if ( !fits_open_file(&fptr, pathname.c_str(), iomode, &status) )
+   {
+      if( !fits_movabs_hdu(fptr, hdunum, NULL, &status) )
+      {
+         this->hdunum = hdunum;
+         int hdutype = -1; 
+         if( !fits_get_hdu_type(fptr, &hdutype, &status) )
+         {
+            switch(hdutype)
+            {
+               case IMAGE_HDU:
+
+                  int nkeys;
+                  if(!fits_get_hdrspace(fptr, &nkeys, NULL, &status))
+                  {
+                     if(nkeys > 0)
+                     {
+                        ;//this->nkeys = nkeys; // ok
+                     }
+                     else
+                     {
+                        fits_close_file(fptr, &status);
+                        if(status) LOG_STREAM << "fits_close_file: " << cfitsio_errmsg(__FILE__,__LINE__,status) << endl;
+                        throw runtime_error(__func__ + string(" : unexpected empty header in file ") + pathname );
+                     }
+                  }
+                  break;
+
+               default:
+                  fits_close_file(fptr, &status);
+                  LOG_STREAM << "fits_close_file: " << cfitsio_errmsg(__FILE__,__LINE__,status) << endl;
+                  throw runtime_error(__func__
+                        + string{" : HDU["}+to_string(hdunum)+"] has incorrect type : "
+                        + to_string(hdutype) + " should be IMAGE" );
+                  break;
+            }
+         }
+      }
+      else
+      {
+         LOG_STREAM << cfitsio_errmsg(__FILE__,__LINE__,status)
+            + " : cannot move to hdunum: " + to_string(hdunum) + " in file " + pathname << endl;
+         throw invalid_argument(cfitsio_errmsg(status)
+               + " : cannot move to hdunum: " + to_string(hdunum) + " in file " + pathname);
+      }
+   }
+   else
+   {
+      LOG_STREAM << cfitsio_errmsg(__FILE__,__LINE__,status) + " : " + pathname << endl;
+      throw invalid_argument(cfitsio_errmsg(status) + " : " + pathname);
+   }
+
+   if(status) throw runtime_error(cfitsio_errmsg(__FILE__,__LINE__,status) + " : " + pathname);
+}
+
+
+
+int fits::header::read_record(int keynum, char *card, int *status)
+{
+   return fits_read_record(fptr, keynum, card, status);
+}
+
+string keytype_to_string(int keytype)
+{
+   string keytype_text;
+
+   // ref: cfitsio manual, fits_parse_template() func description
+   switch(keytype)
+   {
+      case -2: keytype_text = "rename keyword"; break;
+      case -1: keytype_text = "delete keyword"; break;
+      case  0: keytype_text = "append/update key-record"; break;
+      case  1: keytype_text = "append/update HISTORY or COMMENT key-record"; break;
+      case  2: keytype_text = "END record (not written explicitly)"; break;
+      default: throw runtime_error(string{__FILE__} + ":" + to_string(__LINE__)
+                     + ": unknown keytype returned from fits_parse_template");
+   }
+
+   return keytype_text;
+}
+
+void fits::header::update_card(const struct fits_card new_card)
+{
+   LOG_trace(__func__);
+
+   int status = 0;
+
+   char errtxt[1024];
+   char card[FLEN_CARD];
+   card[0] = 0;
+
+   const char * keystr = new_card.key.c_str();
+
+   bool is_card_in_header = ( 0 == fits_read_card(fptr, keystr, card, &status) );
+
+   if (is_card_in_header)
+   {
+      LOG_STREAM << __func__ << " found card[" << keystr  << "]: " << card << endl;
+      return;
+   }
+
+   // try to continue even if real error -> fits_update_card will fail
+   fits_get_errstatus(status, errtxt);
+   LOG_STREAM << __func__ << " fits_read_card status: " << string(errtxt) << endl;
+   status = 0; // reset after expected error
+
+   // add card
+
+   const string newcard_template{ new_card.key + " = " + new_card.value + " / " + new_card.comment };
+
+   // reformat the keyword string to conform to FITS rules
+   int keytype;
+   char buff[1024];
+   strcpy(buff, newcard_template.c_str());
+   if(!fits_parse_template(buff, card, &keytype, &status))
+   {
+      LOG_STREAM << __func__ << " adding card[" << keytype  << "] >" << card << "< " << keytype_to_string(keytype) << endl;
+
+      fits_update_card(fptr, keystr, card, &status);
+   }
+
+   if(status)
+      throw runtime_error(cfitsio_errmsg(__FILE__,__LINE__,status) + " key: " + new_card.key);
+}
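+// Illustrative example: update_card(fits_card{"SURVEY", "'MOS'", "survey id"}) formats the
+// template "SURVEY = 'MOS' / survey id" via fits_parse_template and appends the card;
+// if a SURVEY card already exists in the header, it is left unchanged.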
+
+
+
+
+void fits::header::update(const vector<fits_card> additional_cards)
+{
+   LOG_trace(__func__);
+
+   for(fits_card new_card : additional_cards)
+   {
+      update_card(new_card);
+   }
+}
+
+
+
+
+
+string fits::header::get_header(bool apply_fixes)
+{
+   LOG_trace(__func__);
+
+   string header;
+
+   int status = 0;
+   char * header_cstr;
+   const int nocomments = 1;
+   int nkeys;
+
+   if(fits_hdr2str(fptr, nocomments, NULL, 0, &header_cstr, &nkeys, &status))
+   {
+      throw runtime_error(cfitsio_errmsg(__FILE__,__LINE__,status));
+   }
+   else
+   {
+      if(header_cstr == NULL)
+      {
+         throw runtime_error(__func__ + string(" : fits_hdr2str returned NULL header string"));
+      }
+      else
+      {
+         LOG_STREAM << "header length: " << strlen(header_cstr) << endl;
+
+         if(apply_fixes)
+         {
+            /* FIXME verify header modifs:
+             * GRS FCRAO:  VELOCITY -> VELO____
+             * HI_VGPS:    VELO-LSR -> VELO____
+             * and:        M/S      -> m/s
+             */
+            fix_header(header_cstr);
+         }
+
+         header = header_cstr;
+
+         fits_free_memory(header_cstr, &status);
+         if(status) LOG_STREAM << cfitsio_errmsg(__FILE__,__LINE__,status) << endl;
+      }
+   }
+
+   return header;
+}
+
+
+
+
+/* FIXME the only data-unit access -> consider renaming  fits_header -> fits_hdu  ? */
+
+double fits::header::calc_nullvals(unsigned long long & null_cnt, unsigned long long & tot_cnt)
+{
+   LOG_trace(__func__);
+
+   int ii, status = 0;
+
+   /* informative only, not used */
+   int equivbitpix = 0;
+   if(fits_get_img_equivtype(fptr, &equivbitpix, &status))
+   {
+      throw runtime_error(cfitsio_errmsg(__FILE__,__LINE__,status));
+   }
+   LOG_STREAM << " BITPIX (equivalent): " << equivbitpix << endl;
+
+
+   /* get BLANK for int-HDU's */
+
+   int bitpix = 0;
+   if(fits_get_img_type(fptr, &bitpix, &status))
+   {
+      throw runtime_error(cfitsio_errmsg(__FILE__,__LINE__,status));
+   }
+   LOG_STREAM << " BITPIX: " << bitpix << endl;
+
+   long long blank;
+   //bool blank_found = false;
+   switch(bitpix)
+   {
+      case BYTE_IMG:
+      case SHORT_IMG:
+      case LONG_IMG:
+      case LONGLONG_IMG:
+         if(fits_read_key(fptr,TLONGLONG,"BLANK", &blank,NULL,&status))
+         {
+            status = 0;
+            //blank_found = true;
+            LOG_STREAM << "BLANK not found in header" << endl;
+         }
+         else
+         {
+            //blank_found = false;
+            LOG_STREAM << "BLANK: " << blank << endl;
+         }
+         break;
+      case FLOAT_IMG:
+      case DOUBLE_IMG:
+         break;
+      default:
+         throw runtime_error(string{string(__FILE__) + ":" + to_string(__LINE__) + ": Unknown BITPIX value" });
+   }
+
+
+   /* get image size */
+
+   int anaxis;
+   if(fits_get_img_dim(fptr, &anaxis, &status))
+   {
+      throw runtime_error(cfitsio_errmsg(__FILE__,__LINE__,status));
+   }
+
+   const int MAXAXES = 5;
+   if (anaxis > MAXAXES)
+   {
+      throw runtime_error(string{string(__FILE__) + ":" + to_string(__LINE__)
+            + ": This build supports max " + to_string(MAXAXES) + " axes, but this HDU has " + to_string(anaxis) });
+   }
+
+   long int firstpix[MAXAXES] = {1,1,1,1,1};
+   long int anaxes[MAXAXES]   = {1,1,1,1,1};
+   if(fits_get_img_size(fptr, MAXAXES, (long int*) anaxes, &status))// FIXME explicit conversion
+   {
+      throw runtime_error(cfitsio_errmsg(__FILE__,__LINE__,status));
+   }
+
+   LOG_STREAM << "NDIM(" << anaxis << "):";
+   for (ii=0;ii<anaxis;ii++) LOG_STREAM << " " << anaxes[ii];
+   LOG_STREAM << endl;
+
+
+   /* alloc buffers for one row */
+
+   double *apix = NULL;   // converted pixels
+   char   *anul = NULL;   // true if pixel undefined
+
+   long npixels = anaxes[0];  /* no. of pixels to read in each row */
+
+   /* FIXME review: sizeof(double) matches with TDOUBLE in fits_read_pixnull */
+
+   my_assert(sizeof(double) == 8, __FILE__, __LINE__, "malloc assumes 64bit floating point but sizeof(double) is " + to_string(sizeof(double)));
+
+   apix = (double *) malloc(npixels * sizeof(double)); // mem for 1 row: pixels
+   if (apix == NULL)
+   {
+      throw runtime_error(string{string(__FILE__) + ":" + to_string(__LINE__)
+            + ": failed malloc for one row of pixels" });
+   }
+   anul = (char *) malloc(npixels * sizeof(char));   // mem for 1 row: nulls
+   if (anul == NULL)
+   {
+      free(apix);
+      throw runtime_error(string{string(__FILE__) + ":" + to_string(__LINE__)
+            + ": failed malloc for one row of nulls" });
+   }
+
+
+   /* loop through each pixel and count undefined values */
+
+   unsigned long long nulcnt = 0; // count undefined pixels
+   unsigned long long totcnt = 0; // count all pixels
+                                  // 5th dim
+   for (firstpix[4] = 1; firstpix[4] <= anaxes[4]; firstpix[4]++)
+   {
+      // 4th dim
+      for (firstpix[3] = 1; firstpix[3] <= anaxes[3]; firstpix[3]++)
+      {
+         // 3rd dim
+         for (firstpix[2] = 1; firstpix[2] <= anaxes[2]; firstpix[2]++)
+         {
+            /* loop over all rows of the plane */
+            for (firstpix[1] = 1; firstpix[1] <= anaxes[1]; firstpix[1]++)
+            {
+               // Give starting pixel coordinate and no. of pixels to read.
+               // nulval = 0.0; // 0 = don't convert, return nan & don't return anynul
+               // Other than zero: convert nans to nullval, return anynul set to 0/1
+
+               int anynul = -1;
+
+               // apix: one row of data, npixels long.
+               // All values (including BLANK defined NULL values) are scaled by
+               // BZERO * rawval*BSCALE
+
+               // RBu somewhere on the net (the ref guide does not explain anynul):
+               //  The location of any blank pixels (as determined by the header keywords) is stored
+               //  in nullarray (set to 1). Also anynul will be set to 1 to indicate the presence of
+               //  blank pixels. If anynul==0 then all pixels are non-blank.
+               //  if(anynul==0)... // no blank pixels, so don't bother with any trimming or checking...
+
+               /* TDOUBLE biggest in cfitsio 64 bit */
+               if(fits_read_pixnull(fptr, TDOUBLE, firstpix, npixels,
+                        apix, anul,
+                        &anynul, &status))
+               {
+                  free(anul);
+                  free(apix);   
+                  throw runtime_error(cfitsio_errmsg(__FILE__,__LINE__,status));
+               }
+
+               for(ii=0; ii< npixels; ii++)
+               {
+                  totcnt++;
+                  if(anul[ii]) nulcnt++;
+               }
+            }
+         }// end 3rd dim
+      }// end 4th dim
+   }// end 5th dim
+
+   free(anul);
+   free(apix);
+
+
+   /* format result string */
+
+   double nullfill = 100.0*(double)nulcnt/(double)totcnt;
+
+   null_cnt = nulcnt;
+   tot_cnt  = totcnt;
+   return nullfill;
+}
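+// Illustrative example: a cube with 1000 pixels of which 250 are undefined sets
+// null_cnt=250, tot_cnt=1000 and returns 25.0 (the percentage of undefined pixels).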
+
+
diff --git a/data-access/engine/src/common/src/fits_header.hpp b/data-access/engine/src/common/src/fits_header.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..928e2dc35dcc7d622b1c5f33400cb6bb4c83b123
--- /dev/null
+++ b/data-access/engine/src/common/src/fits_header.hpp
@@ -0,0 +1,82 @@
+#ifndef FITS_HEADER_HPP
+#define FITS_HEADER_HPP
+
+#include "io.hpp"
+#include "my_assert.hpp"
+#include "cutout.hpp" // fits_card
+#include "fitsfiles.hpp"
+#include <fitsio.h>
+
+#include <iostream>
+#include <vector>
+#include <string>
+#include <map>
+
+namespace fits
+{
+
+class header
+{
+   public:
+
+      header() : hdunum{1}, fptr{nullptr}
+      {
+         LOG_trace(std::string{__func__} + " (explicit default constructor)");
+      }
+
+      header(std::string pathname, unsigned long hdunum, int iomode = READONLY)
+         : hdunum{hdunum}, fptr{nullptr}
+      {
+         LOG_trace(__func__);
+         open_fitsfile(pathname, hdunum, iomode);
+      };
+
+      ~header()
+      {
+         LOG_trace(__func__);
+
+         if(fptr != nullptr)
+         {
+            LOG_STREAM << "closing disk-file" << std::endl;
+            int status = 0;
+            fits_close_file(fptr,&status);
+            if(status) LOG_STREAM << "fits_close_file: " << cfitsio_errmsg(__FILE__,__LINE__,status) << std::endl;
+         }
+      };
+
+      int read_record(int keynum, char *card, int *status);
+      int get_nkeys();
+      void update(const std::vector<fits_card> additional_cards);
+      std::string get_header(bool apply_fixes = false);
+      std::string read_card(std::string keyname);
+      bool contains_card(std::string keyname);
+
+      // FIXME unify card-interface: above uses fits_card vs below map<string-key, fits-type>
+
+      // support ObsCoreKeys
+      void parseStrCards(std::map<std::string, std::string>& strCards);
+      void parseUIntCards(std::map<std::string, unsigned long>& uintCards);
+      void parseDoubleCards(std::map<std::string, double>& doubleCards);
+
+      std::map<std::string, std::string>   parse_string_cards(std::set<std::string> str_keys);
+      std::map<std::string, unsigned long> parse_uint_cards(std::set<std::string> str_keys);
+      std::map<std::string, double>        parse_double_cards(std::set<std::string> str_keys);
+      fitsfiles::key_values_by_type parse_cards(fitsfiles::keys_by_type keys);
+
+      double calc_nullvals(unsigned long long & null_cnt, unsigned long long & tot_cnt);
+
+   protected:
+
+      std::string cfitsio_errmsg(const char * filename, int line_num, int status);
+      std::string cfitsio_errmsg(int status);
+      void open_fitsfile(std::string pathname_template, unsigned long hdunum, int iomode);
+      void update_card(const struct fits_card new_card);
+
+      unsigned long hdunum;
+      fitsfile * fptr;
+};
+
+}
+
+#endif
+
diff --git a/data-access/engine/src/common/src/fitsfiles.cpp b/data-access/engine/src/common/src/fitsfiles.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..e33e0f30d435772e4258446a4693f1ca03e799f6
--- /dev/null
+++ b/data-access/engine/src/common/src/fitsfiles.cpp
@@ -0,0 +1,370 @@
+
+#include "fitsfiles.hpp"
+#include "cutout.hpp" // struct fits_card needed
+
+#include "io.hpp"
+
+#include "fits_header.hpp"
+
+#include <cstring>
+#include <string>
+#include <vector>
+#include <stdexcept>
+
+// C libs
+#include <stdlib.h>// malloc
+#include <glob.h>
+#include <fitsio.h>
+#include <string.h> // strerror(errno) needed
+
+#include <libgen.h> // basename() needed
+#include <sys/stat.h> // stat - filesize needed
+
+using namespace std;
+
+
+
+
+uintmax_t fitsfiles::fileSize(string pathname)
+{
+   struct stat st; 
+   int rc = stat(pathname.c_str(), &st);
+
+   if(rc != 0)
+      throw runtime_error(string{__FILE__} + ":" + to_string(__LINE__) + ": " + string{strerror(rc)});
+
+   off_t size = st.st_size;
+
+   return size;
+}
+
+
+
+double fitsfiles::calc_nullvals(string pathname, unsigned int hdunum,
+      unsigned long long & null_cnt, unsigned long long & total_cnt)
+{
+   LOG_trace(__func__);
+
+   fits::header hdr(pathname, hdunum, READONLY);
+
+   return hdr.calc_nullvals(null_cnt, total_cnt);
+}
+
+
+
+// FIXME check for an empty glob return value & check dir-not-exist, no-access etc...
+// (cases when the command is executed on a different machine)
+// list only files (distinguish files vs dirs)
+// hidden files are not listed
+// no dir recursion
+vector<string> fitsfiles::globVector(const string& pattern)
+{
+   glob_t glob_result;
+   glob(pattern.c_str(),GLOB_TILDE,NULL,&glob_result);
+   vector<string> files;
+   for(unsigned int i=0;i<glob_result.gl_pathc;++i){
+      files.push_back(string(glob_result.gl_pathv[i]));
+   }
+   globfree(&glob_result);
+   return files;
+}
+
+
+
+// class header based
+
+
+
+string fitsfiles::cfitsio_errmsg(const char * filename, int line_num, int status)
+{
+   char errmsg[32]; // cfitsio doc says errmsg is "30 chars max"
+   errmsg[0] = 0;
+   fits_get_errstatus(status, errmsg);
+   string msg{"ERR["+to_string(status)+"] " + string{filename} + "." + to_string(line_num) + ":" + string{errmsg}};
+
+   return msg;
+}
+
+
+string fitsfiles::read_header(string pathname, unsigned int hdunum)
+{
+   LOG_trace(__func__);
+
+   const bool apply_fixes = true;
+
+   fits::header hdr(pathname, hdunum, READONLY);
+   return hdr.get_header(apply_fixes);
+}
+
+
+string fitsfiles::read_card(const std::string pathname, unsigned int hdunum, const string keyname)
+{
+   fits::header hdr(pathname, hdunum);
+   string card{hdr.read_card(keyname)};
+
+   return card;
+}
+
+
+void fitsfiles::add_cards_if_missing(const std::string pathname, unsigned int hdunum, const std::vector<struct fits_card> cards)
+{
+   fits::header hdr(pathname, hdunum, READWRITE);
+   hdr.update(cards);
+}
+
+
+void fitsfiles::fits_hdu_cut(const string infile, const unsigned int hdunum,
+      const string outfile)
+{
+   LOG_trace(__func__);
+
+   LOG_STREAM << "input infile : " << infile << " hdunum: "<< to_string(hdunum) <<endl;
+   LOG_STREAM << "input outfile: " << outfile << endl;
+
+   int status = 0;
+
+   fitsfile *infptr, *outfptr; 
+
+   // cut
+
+   if ( !fits_open_file(&infptr, infile.c_str(), READONLY, &status) )
+   {
+      if ( !fits_create_file(&outfptr, outfile.c_str(), &status) )
+      {
+         if( !fits_movabs_hdu(infptr, hdunum, NULL, &status) )
+         {
+            int morekeys = 0; // reserve space for more keys in destination
+            if(!fits_copy_hdu(infptr, outfptr, morekeys, &status))
+            {
+               LOG_STREAM << "fits_copy_hdu  status: " << status << endl;
+            }
+         }
+         else
+         {
+            LOG_STREAM << cfitsio_errmsg(__FILE__,__LINE__,status) << endl;
+         }
+         // Reset status after normal error
+         if (status == END_OF_FILE) status = 0;
+
+         fits_close_file(outfptr,  &status);
+      }
+      else
+      {
+         string errmsg{fitsfiles::cfitsio_errmsg(__FILE__,__LINE__,status)};
+         fits_close_file(infptr, &status);
+         throw runtime_error(errmsg + " " + outfile);
+      }
+      fits_close_file(infptr, &status);
+   }
+   else
+   {
+      throw runtime_error(fitsfiles::cfitsio_errmsg(__FILE__,__LINE__,status) + " " + infile);
+   }
+
+   if(status) 
+   {
+      throw runtime_error(fitsfiles::cfitsio_errmsg(__FILE__,__LINE__,status) + " " + infile);
+   }
+}
+
+
+// adds cards or modifies a card value - replaces the run-time fix.c for this at ingestion time
+int fitsfiles::mod_value(string filename, string token, string keyvalue)
+{
+   LOG_trace(__func__);
+
+   fitsfile *fptr;         /* FITS file pointer, defined in fitsio.h */
+   char card[FLEN_CARD], newcard[FLEN_CARD];
+   char oldvalue[FLEN_VALUE], comment[FLEN_COMMENT];
+   int status = 0;   /*  CFITSIO status value MUST be initialized to zero!  */
+   int iomode, keytype;
+
+   bool do_update = !keyvalue.empty();
+
+   if (do_update)
+      iomode = READWRITE;
+   else
+      iomode = READONLY;
+
+   if (!fits_open_file(&fptr, filename.c_str(), iomode, &status))
+   {
+      //if (fits_read_card(fptr, keyname.c_str(), card, &status))
+      if (fits_read_str(fptr,token.c_str(), card, &status))
+      {
+         printf("Keyword does not exist\n");
+         card[0] = '\0';
+         comment[0] = '\0';
+         status = 0;  /* reset status after error */
+         strcpy(card,token.c_str());
+         //goto f_end;
+      }
+      else
+      {
+         printf("%s\n",card);
+      }
+
+      if (do_update)  /* write or overwrite the keyword */
+      {
+         char keyname[80];
+         int keylength;
+         fits_get_keyname(card, keyname, &keylength, &status);
+
+         /* check if this is a protected keyword that must not be changed */
+         if (*card && fits_get_keyclass(card) == TYP_STRUC_KEY)
+         {
+            printf("Protected keyword cannot be modified.\n");
+         }
+         else
+         {
+            /* get the comment string */
+            if (*card)fits_parse_value(card, oldvalue, comment, &status);
+
+            /* construct template for new keyword */
+            strcpy(newcard, keyname);     /* copy keyword name */
+            strcat(newcard, " = ");       /* '=' value delimiter */
+            strcat(newcard, keyvalue.c_str());     /* new value */
+            if (*comment) {
+               strcat(newcard, " / ");  /* comment delimiter */
+               strcat(newcard, comment);     /* append the comment */
+            }
+
+            /* reformat the keyword string to conform to FITS rules */
+            fits_parse_template(newcard, card, &keytype, &status);
+
+            /* overwrite the keyword with the new value */
+            fits_update_card(fptr, keyname, card, &status);
+
+            printf("Keyword has been changed to:\n");
+            printf("%s\n",card);
+         }
+      }
+f_end:
+      fits_close_file(fptr, &status);
+   }
+
+   /* if error occured, print out error message */
+   if (status) fits_report_error(stderr, status);
+
+   return(status);
+}
+
+/* FIXME there is also in fits_header - keep only one */
+string keytype_to_string2(int keytype)
+{
+   string keytype_text;
+
+   // ref: cfitsio manual, fits_parse_template() func description
+   switch(keytype)
+   {
+      case -2: keytype_text = "rename keyword"; break;
+      case -1: keytype_text = "delete keyword"; break;
+      case  0: keytype_text = "append/update key-record"; break;
+      case  1: keytype_text = "append/update HISTORY or COMMENT key-record"; break;
+      case  2: keytype_text = "END record (not written explicitly)"; break;
+      default: throw runtime_error(string{__FILE__} + ":" + to_string(__LINE__)
+                     + ": unknown keytype returned from fits_parse_template");
+   }
+
+   return keytype_text;
+}
+
+bool is_key_in_header(const string header, const string key8)
+{
+   string::size_type ccount;
+   for(ccount = 0; ccount < header.length(); ccount += 80) // 80 = card length (FLEN_CARD - 1)
+   {
+      string curr_key = header.substr(ccount, 8);
+      //LOG_STREAM << key8 << " " <<curr_key << endl;
+      if(curr_key.compare(key8) == 0) return true;
+   }
+   return false;
+}
+
+/* header string must not contain closing END-card */
+string fitsfiles::append_card_if_not_in_header(string header, const vector<fits_card> additional_cards)
+{
+   LOG_trace(__func__);
+
+   string new_header_str{header};
+
+   LOG_STREAM << "header length: " << strlen(new_header_str.c_str()) << endl;
+
+   for(fits_card new_card : additional_cards)
+   {
+      int status = 0;
+      int keytype;
+      const int BUFF_SIZE{ 1024 };
+      char buff[BUFF_SIZE];
+      char card[FLEN_CARD];
+      card[0] = 0;
+
+      const string newcard_template{ new_card.key + " = " + new_card.value + " / " + new_card.comment };
+      my_assert(newcard_template.length() < BUFF_SIZE, __FILE__, __LINE__, "formatted card does not fit into buffer");
+      strcpy(buff, newcard_template.c_str());
+
+      if(!fits_parse_template(buff, card, &keytype, &status))
+      {
+         string card_key{ string{card}.substr(0,8) };
+         if( is_key_in_header(header, card_key) )
+         {
+            LOG_STREAM << __func__ << " found card[" << card_key  << "]: " << card << endl;
+         }
+         else
+         {
+            LOG_STREAM << __func__ << " appending card[" << keytype  << "] >" << card
+               << "< " << keytype_to_string2(keytype) << endl;
+
+            new_header_str += string{ card };
+         }
+      }
+      else
+      {
+         throw runtime_error(cfitsio_errmsg(__FILE__,__LINE__,status) + " key: " + new_card.key);
+      }
+
+   }
+
+   LOG_STREAM << "header length: " << strlen(new_header_str.c_str()) << endl;
+
+   return new_header_str;
+}
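+// Illustrative example: appending fits_card{"OBJECT","'G010.5'","target"} to a header
+// string without an OBJECT card adds one formatted 80-char card; presence is decided by
+// comparing the first 8 characters of each existing card.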
+
+
+
+// deprecated
+
+vector<fitsfiles::Hdu> fitsfiles::fname2hdrstr(
+      string filename, unsigned int maxHduPos,
+      const keys_by_type* keys)
+{
+   LOG_trace(__func__);
+   LOG_STREAM << filename << endl;
+
+   vector<fitsfiles::Hdu> vec_hdrs;
+
+   // FIXME enhance fits::header to avoid open the file at each cycle
+   unsigned int hdupos;
+   for(hdupos = 1; hdupos <= maxHduPos; hdupos++)
+   {
+      fits::header hdr(filename, hdupos);
+      // FIXME catch the run-time exception if not IMAGE_HDU -> how to deal with this?
+      // and if hdupos > last_hdu ? ->
+      // implement read_all_hdus without prior knowledge of how many HDUs are in the FITS file:
+      // read to file-end but ignore (only) the FILE-end error
+
+      const bool apply_fixes = true;
+      string header_str{hdr.get_header(apply_fixes)};
+
+      key_values_by_type key_values;
+
+      if(keys != nullptr)
+      {
+         key_values = hdr.parse_cards(*keys);
+      }
+
+      vec_hdrs.push_back( fitsfiles::Hdu{hdupos, header_str, key_values} );
+   }
+
+   return vec_hdrs;
+}
+
diff --git a/data-access/engine/src/common/src/fix_header.cpp b/data-access/engine/src/common/src/fix_header.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..7cfd1f45b4455f68f79355573973f58a130479c4
--- /dev/null
+++ b/data-access/engine/src/common/src/fix_header.cpp
@@ -0,0 +1,47 @@
+
+#include <string.h>
+
+#include "fix_header.hpp"
+#include "io.hpp"
+
+char * fix_header(char * header)
+{
+   LOG_trace(__func__);
+
+   char * p;
+
+
+   /* Fix 1: VELO-LSR is not standard per Paper III;
+    * it is the FITS-CLASS encoding from the 'CLASS' sw package: AST manual p 438..439.
+    * AST man p438: the FITS-CLASS encoding is recognized if the VELO-LSR & DELTAV keywords are present.
+    * AST then gives the error msg:
+    * ! astRead(FitsChan): FITS-CLASS keyword CTYPE3 has value "VELO-LSR" - CLASS support in AST only includes "FREQ" axes.
+    * FIXME LSR should go to a new card: SPECSYS = LSRK
+    * VELO-LSR appears at least in HI_VGPS MOS_017*
+    * ast-8.2.0 has fixed this;
+    * if updated to ast-8.4.0, remove the VELO-LSR replacement.
+    */
+   p = strstr(header, "VELO-LSR");
+   if(p)strncpy(p,"VELO    ",8);
+
+
+   /* HI_VGPS encodes the rest frequency as FREQ0:
+    * FIXME how to deal with this? Other surveys may use FREQ0 for something else.
+    * p = strstr(header, "FREQ0   ");
+    * if(p)strncpy(p,"RESTFRQ ",8);
+    */
+
+
+   // Fix 2: VELOCITY is not standard per Paper III
+   p = strstr(header, "VELOCITY");
+   if(p)strncpy(p,"VELO    ",8);
+
+
+   // Fix 3: velocity axis unit in capitals, should be lower case (see Paper III?), e.g. HI_VGPS MOS_017
+   p = strstr(header, "M/S");
+   if(p)strncpy(p,"m/s",3);
+
+
+   return header;
+}
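+/* Illustrative example: a header containing "CTYPE3  = 'VELO-LSR'" and "CUNIT3  = 'M/S'"
+ * is patched in place to "VELO    " and "m/s"; only the first occurrence of each pattern
+ * is replaced, since strstr() is called once per fix. */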
+
diff --git a/data-access/engine/src/common/src/fix_header.hpp b/data-access/engine/src/common/src/fix_header.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..03fc8d81cb9dc038d385f9bfce23e77df2ec647e
--- /dev/null
+++ b/data-access/engine/src/common/src/fix_header.hpp
@@ -0,0 +1,10 @@
+
+#ifndef FIX_HEADER_HPP
+#define FIX_HEADER_HPP
+
+/* implement fixes to nonstandard fits-header */
+
+char * fix_header(char * header);
+
+#endif
+
diff --git a/data-access/engine/src/common/src/io.cpp b/data-access/engine/src/common/src/io.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..c0346c682ba85ba19c9cb8ab9e99f3a791e71979
--- /dev/null
+++ b/data-access/engine/src/common/src/io.cpp
@@ -0,0 +1,53 @@
+
+#include "io.hpp"
+
+#include <string>
+#include <fstream>
+#include <chrono>
+#include <ctime>  
+
+#include <unistd.h> // getpid  
+#include <sys/types.h> // pid_t ?
+#include <sys/syscall.h> // syscall NR_gettid
+
+using namespace std;
+
+std::ofstream LOG_STREAM;
+
+
+
+void LOG_open(string log_dir, string log_filename)
+{
+	//pid_t pid = getpid();
+	//long int tid = syscall(__NR_gettid);
+
+	std::string log_pathname = log_dir + "/" + log_filename;
+	//std::string log_pathname = log_dir + "/" + log_filename + "-" + to_string(pid) + "-" + to_string(tid);
+	LOG_STREAM.open(log_pathname, std::ios::out);
+	if (LOG_STREAM) 
+	{
+		LOG_STREAM << "LOG_start";
+		auto nnn = std::chrono::system_clock::now();
+		std::time_t now_time = std::chrono::system_clock::to_time_t(nnn);
+		LOG_STREAM << " " << ctime(&now_time) << std::endl;
+	}
+}
+
+
+void LOG_close()
+{ 
+	if (LOG_STREAM) 
+	{
+		LOG_STREAM << "LOG___end";
+		auto nnn = std::chrono::system_clock::now();
+		std::time_t now_time = std::chrono::system_clock::to_time_t(nnn);
+		LOG_STREAM << " " << ctime(&now_time) << std::endl;
+		LOG_STREAM.close();
+	}
+}
+
+void LOG_trace(const std::string line)
+{
+	if(LOG_STREAM) LOG_STREAM << "TRC " + line << std::endl;
+}
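+// Usage sketch (hypothetical paths): one log file per process, e.g.
+//   LOG_open("/tmp", "vlkb-cutout.log");
+//   LOG_trace(__func__);                 // writes a "TRC <name>" line
+//   LOG_STREAM << "detail" << std::endl;
+//   LOG_close();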
+
diff --git a/data-access/engine/src/common/src/json_request.cpp b/data-access/engine/src/common/src/json_request.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..60574afd64d378041e5a8f3f340f044fad0962e8
--- /dev/null
+++ b/data-access/engine/src/common/src/json_request.cpp
@@ -0,0 +1,80 @@
+
+
+#include "json_request.hpp"
+#include "io.hpp"
+
+#include "cutout.hpp"
+#include "cutout_nljson.hpp"
+#include "mcutout.hpp"
+#include "mcutout_nljson.hpp"
+
+
+#include <stdexcept>
+#include "json.hpp"
+
+#include <iostream>
+
+using json = nlohmann::json;
+const bool ASSERTS = true;
+
+using namespace std;
+
+
+
+
+NLOHMANN_JSON_SERIALIZE_ENUM( service, {
+      {MCUTOUT,"MCUTOUT"},
+      {MERGEF, "MERGEF"},
+      {MERGE1, "MERGE1"},
+      {MERGE2, "MERGE2"},
+      {MERGE3, "MERGE3"},
+      {SUBIMG, "SUBIMG"}
+      });
+
+
+json_request::json_request(string request_json)
+{
+   LOG_trace(__func__);
+
+   m_jservice = json::parse(request_json, nullptr, ASSERTS);
+   m_service = m_jservice.at("service");
+
+   LOG_STREAM << m_jservice.dump() << endl;
+}
+
+struct coordinates json_request::coordinates() {return m_jservice.at("coordinates");}
+
+vector<struct cut_param_s> json_request::cut_params()
+{
+   LOG_trace(__func__);
+
+   vector<struct cut_param_s> cut_pars;
+   json cuts = m_jservice.at("cuts");
+   cut_pars = cuts.get<std::vector<struct cut_param_s>>();
+
+   return cut_pars;
+}
+
+std::vector<struct fits_card> json_request::extra_cards()
+{
+   vector<struct fits_card> cards;
+   if(m_jservice.contains("extra_cards"))
+   {
+      json jcards = m_jservice.at("extra_cards");
+      cards = jcards.get<std::vector<struct fits_card>>();
+   }
+   return cards;
+}
+
+
+std::vector<std::string> json_request::get_pol()
+{
+   vector<string> str;
+   if(m_jservice.contains("pol"))
+   {
+      json j = m_jservice.at("pol");
+      str = j.get<std::vector<string>>();
+   }
+   return str;
+}
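+// Illustrative request ("service" is read in the constructor, the other members on demand):
+//   {"service":"SUBIMG",
+//    "coordinates":{"skysystem":"GALACTIC","shape":"CIRCLE","l":10.0,"b":-0.5,"r":0.1},
+//    "extra_cards":[{"key":"ORIGIN","value":"'VLKB'","comment":"added"}],
+//    "pol":["I","Q"]}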
+
diff --git a/data-access/engine/src/common/src/m4vl.cpp b/data-access/engine/src/common/src/m4vl.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..33a1c0912eb2aa56b37024a0beb6d75ef46e7b69
--- /dev/null
+++ b/data-access/engine/src/common/src/m4vl.cpp
@@ -0,0 +1,644 @@
+
+
+// remove subtree
+#define _XOPEN_SOURCE 500
+#include <ftw.h> // tree walk
+#include <unistd.h>
+
+#include <errno.h>
+#include <stdio.h>
+#include <stdlib.h> // strtol
+#include <string.h> // strsep
+#include <sys/stat.h> // dir handling
+#include <dirent.h> // dir handling
+#include <libgen.h> // basename
+#include <limits.h> //OPEN_MAX
+
+#include <linux/limits.h> // PATH_MAX NAME_MAX
+
+#include "m4vl.hpp"
+//#include "utils.h"
+//#include "dbg.h"
+#include "io.hpp"
+
+
+// removes the working subtree, then returns the given error code
+#define cleanup_and_return(xmpath, rrcc) {rmrf(xmpath); return (rrcc);}
+
+extern char * usec_timestamp(char * ts, size_t ts_len); // FIXME from cutout.cpp
+
+/* three below to get ThreadID & getpid */
+#include <sys/types.h>
+#include <sys/syscall.h>
+#include <unistd.h>
+
+using namespace std;
+
+char * get_pid_tid(char pidtidstr[NAME_MAX])
+{
+   long int tid = syscall(__NR_gettid); // ThreadID
+   sprintf(pidtidstr,"%d-%ld",getpid(),tid);
+
+   return pidtidstr;
+}
+
+
+
+
+
+int rmrf_unlink_cb(const char *fpath, const struct stat */*sb*/, int /*typeflag*/, struct FTW */*ftwbuf*/)
+{
+   //DBG_PRINTF("%s: removing: %s\n",__func__,fpath);
+   LOG_STREAM << string{__func__} + ": removing: " + string{fpath} << endl;
+
+   int rv = remove(fpath);
+
+   if (rv) {
+      //DBG_PRINTF("%s: remove: %s for %s\n",__func__,strerror(errno),fpath);
+      LOG_STREAM << string{__func__} + ": remove: " + string{strerror(errno)} + " for " + string{fpath} << endl;
+   }
+
+   return rv;
+}
+int rmrf(char *path)
+{
+   return nftw(path, rmrf_unlink_cb, 64, FTW_DEPTH | FTW_PHYS);
+}
+
+// popen() captures only stdout, not stderr, so the command redirects stderr to stdout.
+// FIXME is popen() thread-safe?
+// returns 0 if Ok
+//         non-zero on failure
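+// e.g. for argcmd "mImgtbl <dir> <tbl>" the executed command becomes:
+//   { time mImgtbl <dir> <tbl> ; } 2>&1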
+int exec_montage_cmd(const char * argcmd) {
+
+   LOG_trace(__func__);
+
+   int rc = 0;
+   FILE *fp;
+#define BUFSIZE 128
+   char buf[BUFSIZE];
+
+   char cmd[1024];//FIXME length
+   sprintf(cmd,"{ time %s ; } 2>&1",argcmd);
+
+   //DBG_PRINTF("%s: %s\n",__func__,cmd);
+   LOG_STREAM << __func__ << ": " << cmd << endl;
+
+   if ((fp = popen(cmd, "r")) == NULL) {
+      //DBG_PRINTF("Error opening pipe, or !\n");
+      LOG_STREAM << "Error opening pipe, or !" << endl;
+      rc = -10;
+      goto f_end;
+   }
+
+   while (fgets(buf, BUFSIZE, fp) != NULL) {
+      // Do whatever you want here...
+      //DBG_PRINTF("%s: OUTPUT: %s\n", __func__, buf);
+      LOG_STREAM << __func__ << ": OUTPUT: " << buf << endl;
+   }
+
+   // NOTE Montage commands appear to
+   // exit with status 1 and write [struct stat="ERROR", msg % ...] on error,
+   // exit with status 0 and write [struct stat="OK", msg % ...] on success.
+
+
+   // pclose() returns the exit status of the cmd as given by wait4()/waitpid();
+   // it returns -1 if there was an error obtaining the command's exit status,
+   // otherwise it returns the exit status itself.
+   // As commands traditionally set exit status 0 to indicate success,
+   // checking if(pclose(fp)) tests for errors in either pclose()
+   // or the command.
+   int clrc;
+   if ((clrc = pclose(fp))) {
+      if(clrc == -1) {
+         // DBG_PRINTF("%s: pclose: Command not found: %s\n",__func__,cmd);
+   	 LOG_STREAM << __func__ << ": pclose: Command not found: " << cmd << endl;
+         rc = -20;
+      }
+      else {
+         //DBG_PRINTF("%s: pclose: Command exited with error status: %d\n",__func__,clrc);
+         LOG_STREAM << __func__ << ": pclose: Command exited with error status: " << clrc << endl;
+         rc = -21;
+      }
+      goto f_end;
+   }
+
+f_end:
+   return rc;
+}
+
+void printdir(const char * dir) {
+
+   LOG_trace(__func__);
+
+   // DEBUG: list dir
+   DIR *dp;
+   struct dirent *ep;
+
+   dp = opendir (dir);
+   if (dp != NULL)
+   {
+      while ((ep = readdir (dp))){
+         // skip cur and parent dir
+         if( (strncmp( ep->d_name, ".",  1 ) == 0) ||
+               (strncmp( ep->d_name, "..", 2 ) == 0) ) {
+            continue;
+         }
+         //DBG_PRINTF("%s[%s]: %s\n",__func__,dir,ep->d_name);
+   	 LOG_STREAM << __func__ << "[" << dir <<  "]: " << ep->d_name << endl;
+      }
+      (void) closedir (dp);
+   }
+   else {
+      //DBG_PRINTF("%s: opendir: %s for %s\n",__func__,strerror(errno),dir);
+      LOG_STREAM << __func__ << ": opendir: " << strerror(errno) << " for " << dir << endl;
+   }
+}
+
+// return 0 if merge successful, merged file created
+//        non-zero for errors
+int M4VL_mergefiles(struct merge_files * p, size_t nfiles, char * fitsfiles[], char * argmerged, size_t argmerged_len) {
+
+   LOG_trace(__func__);
+
+
+   int rc = 0;
+   // max possible path name length
+   //#define MPATHLEN PATH_MAX
+   char mdir[MPATHLEN]; // name created from timestamp and pid+tid
+#define TS_LEN (256)
+   char ts[TS_LEN];
+   char pidtidstr[NAME_MAX];
+   sprintf(mdir, "MERGE_%s_%s",
+         usec_timestamp(ts,TS_LEN),
+         get_pid_tid(pidtidstr));
+
+   // global strings (for all merge commands)
+   // subtree where montage works
+   char mpath[MPATHLEN+1];
+   sprintf(mpath,"%s/%s",p->mroot,mdir);
+   // dir where input fits files are
+   char mpathin[MPATHLEN + 4];
+   sprintf(mpathin,"%s/%s",mpath,"in");
+   // dir where montage places re-projected fits files
+   char mpathproj[MPATHLEN + 8];
+   sprintf(mpathproj,"%s/%s",mpath,"proj");
+
+   if(mkdir(mpath, S_IRWXU)){
+      // FIXME threadsafe is strerror_r()
+      //DBG_PRINTF("%s: mkdir: %s for %s\n",__func__,strerror(errno),mpath);
+      LOG_STREAM << __func__ << ": mkdir: " << strerror(errno) << " for " << mpath << endl;
+      rc = -10;
+      return rc;
+   }
+   if(mkdir(mpathin, S_IRWXU)){
+      //DBG_PRINTF("%s: mkdir: %s for %s\n",__func__,strerror(errno),mpathin);
+      LOG_STREAM << __func__ << ": mkdir: " << strerror(errno) << " for " << mpathin << endl;
+      rc = -11;
+      cleanup_and_return(mpath, rc);
+   }
+   if(mkdir(mpathproj, S_IRWXU)){
+      //DBG_PRINTF("%s: mkdir: %s for %s\n",__func__,strerror(errno),mpathproj);
+      LOG_STREAM << __func__ << ": mkdir: " << strerror(errno) << " for " << mpathproj << endl;
+      rc = -12;
+      cleanup_and_return(mpath, rc);
+   }
+
+   // make a link of files to be merged into the merge in-dir
+   size_t i;
+   for (i=0;i<nfiles;i++){
+
+      char fn[MPATHLEN + MFNAMELEN];
+      strcpy(fn, fitsfiles[i]);
+
+      //DBG_PRINTF("%s: symlink src : %s\n",__func__,fn);
+      LOG_STREAM << __func__ << ": symlink src: " << fn << endl;
+
+      char destpath[MPATHLEN + 8];  
+      snprintf (destpath, sizeof(destpath), "%s/%s", mpathin, basename(fn));
+
+      //DBG_PRINTF("%s: symlink dest: %s\n",__func__,destpath);
+      LOG_STREAM << __func__ << ": symlink src: " << destpath << endl;
+
+      if(symlink(fn,destpath)) {
+         // DBG_PRINTF("%s: symlink: %s\n",__func__,strerror(errno));
+         LOG_STREAM << __func__ << ": symlink: " << strerror(errno) << endl;
+         rc = -20;
+         cleanup_and_return(mpath, rc);
+      }
+   }
+   printdir(mpath);
+   printdir(mpathin);
+
+   // NAXIS from vlkbif.java::merge
+   long dim = strtol(p->prefix,NULL,10);
+   //DBG_PRINTF("%s: merge dim(%s): %ld\n",__func__,p->prefix,dim);
+   LOG_STREAM << __func__ << ": merge dim(" << p->prefix << "): " << dim << endl;
+
+   //
+   // 1 create common header
+   //
+   //#define MFNAMELEN NAME_MAX
+   char intbl[MPATHLEN + MFNAMELEN];
+   sprintf(intbl,"%s/inputs.tbl",mpath);
+
+   // FIXME assume 5 params which could be full pathnames
+#define MPARAMSCNT 5
+#define MCMDLEN (MPARAMSCNT*(MPATHLEN + MFNAMELEN))
+   // current merge command
+   int mrc = 0;
+   char mcmd[MCMDLEN];
+   sprintf(mcmd,"mImgtbl %s %s",mpathin,intbl);
+   if((mrc = exec_montage_cmd(mcmd))) {
+      rc = -30;
+      cleanup_and_return(mpath, rc);
+   }
+   printdir(mpath);
+
+   char cmnhdr[MPATHLEN + MFNAMELEN];
+   sprintf(cmnhdr,"%s/common.hdr",mpath);
+   sprintf(mcmd,"mMakeHdr %s %s %s",intbl,cmnhdr,"GAL");
+   if((mrc = exec_montage_cmd(mcmd))) {
+      rc = -31;
+      cleanup_and_return(mpath, rc);
+   }
+   printdir(mpath);
+   //    system("cat /tmp/MERGECTEST/common.hdr");
+
+   //
+   // 2 reproject all input files
+   //
+   // DEBUG: list dir --> or use scandir() ??
+   DIR *dp;
+   struct dirent *ep;
+
+   dp = opendir (mpathin);
+   if (dp == NULL)  {
+      //DBG_PRINTF("%s: opendir: %s\n",__func__,strerror(errno));
+      LOG_STREAM << __func__ << ": opendir: " << strerror(errno) << endl;
+      rc = -31;
+      cleanup_and_return(mpath, rc);
+   }
+
+   while ((ep = readdir (dp))){
+      // skip cur and parent dir
+      if( (strncmp( ep->d_name, ".",  1 ) == 0) ||
+            (strncmp( ep->d_name, "..", 2 ) == 0) ) {
+         continue;
+      }
+      //DBG_PRINTF("%s[%s]: %s\n",__func__,mpathin, ep->d_name);
+      LOG_STREAM << __func__ << "[" << mpathin  <<"]: " << ep->d_name << endl;
+
+      char inf[MPATHLEN + MFNAMELEN + 8];
+      sprintf(inf,"%s/%s",mpathin,ep->d_name);
+      char projf[MPATHLEN + MFNAMELEN + 16];
+      sprintf(projf,"%s/%s-proj",mpathproj,ep->d_name);
+
+      if(dim == 2) {
+         sprintf(mcmd,"mProjectQL %s %s %s",inf,projf,cmnhdr);
+      } else {
+         sprintf(mcmd,"mProjectCube %s %s %s",inf,projf,cmnhdr);
+      }
+
+      if((mrc = exec_montage_cmd(mcmd))) {
+         rc = -32;
+         cleanup_and_return(mpath, rc);
+      }
+   }
+   (void) closedir (dp);
+
+   printdir(mpathproj);
+
+   //
+   // 3. merge projected cubes
+   //
+   //  String projtbl = FITScutpath + mergedir + "/projected.tbl";
+   //  String[] mImgtblproj = {"mImgtbl", projpath, projtbl};
+   //  result.add(execMontage(mImgtblproj));
+   char projtbl[MPATHLEN + MFNAMELEN];
+   sprintf(projtbl,"%s/projected.tbl",mpath);
+
+   sprintf(mcmd,"mImgtbl %s %s",mpathproj,projtbl);
+   if((mrc = exec_montage_cmd(mcmd))) {
+      rc = -33;
+      cleanup_and_return(mpath, rc);
+   }
+   printdir(mpath);
+   //
+   // FIXME2: in Java result.add() was returning a string with full montage command and its returncode and stdout/err
+   //    String returnStr = exitVal + " " + cmdoutput + " fullCmd: " + fullCmd;
+   //
+   //  exec_montage_cmd() should do similar:
+   //
+   //  int rc exec_montage_cmd(...,struct execMontage * pexit) where:
+   //  struct execMontage { int cmdstatus; char * cmdoutput; char * fullcmd }
+   //  and rc is error code of the execMonatge() call itself (not that of argcmd)
+   //
+   //
+   //    String[] mAddCube = {"mAddCube", "-p", projpath, projtbl, cmnhdr, mergedfile};
+   //    result.add(execMontage(mAddCube));
+   char merged[MPATHLEN + MFNAMELEN];
+   // strip terminating slash if exist
+   if( p->mresdir[strlen(p->mresdir)-1] == '/'){
+      sprintf(merged,"%svlkb-merged_%sD_%s_%s.fits",p->mresdir,p->prefix,ts,pidtidstr);
+   }else{
+      sprintf(merged,"%s/vlkb-merged_%sD_%s_%s.fits",p->mresdir,p->prefix,ts,pidtidstr);
+   }
+   if(dim == 2){
+      // John Good/Montage: mAdd with -n if mProjectQL used
+      sprintf(mcmd,"mAdd -n -p %s %s %s %s",mpathproj,projtbl,cmnhdr,merged);
+   } else {
+      sprintf(mcmd,"mAddCube -p %s %s %s %s",mpathproj,projtbl,cmnhdr,merged);
+   }
+
+   if((mrc = exec_montage_cmd(mcmd))) {
+      rc = -34;
+      cleanup_and_return(mpath, rc);
+   }
+   printdir(mpath);
+
+   // FIXME this is terrible !!
+   for(i=0;i<argmerged_len;i++) argmerged[i] = 0;
+   strncpy(argmerged,merged,argmerged_len);
+
+   //DBG_PRINTF("%s: removing subtree...\n",__func__);
+   LOG_STREAM << __func__ << ": removing subtree..." << endl;
+   rmrf(mpath);
+   return rc;
+}
+
+
+
+/////////////////////////////////////////////////////////////////////////////////////////////////////////////////
+// parallelizing:
+// split M4VL_mergefiles into 3 steps
+// * create common header        M4VL_mergefiles_common_header( ... OUT: mpath)
+// * re-project one input file to common header M4VL_mergefiles_reprojection( ..., IN: mpath, fileA.fits, ... )
+// * merge together reprojected files     M4VL_mergefiles_add_reprojected(..., IN: mpath, ... )
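+//
+// Intended call sequence (sketch, mirroring M4VL_mergefiles_split() below):
+//
+//   struct merge_config cfg;
+//   M4VL_merge_config_init(jobid, mroot, mresdir, dim, &cfg);
+//   M4VL_mergefiles_common_header(&cfg, nfiles, files);
+//   for each file f: M4VL_mergefiles_reproject(&cfg, basename(f));  // parallelizable
+//   M4VL_mergefiles_add_reprojected(&cfg, merged, merged_len);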
+
+void merge_config_print(const struct merge_config * s)
+{
+#if 0
+   DBG_PRINTF("%s: %s\n",__func__,s->mpath);
+   DBG_PRINTF("%s: %s\n",__func__,s->mpathin);
+   DBG_PRINTF("%s: %s\n",__func__,s->mpathproj);
+   DBG_PRINTF("%s: %s\n",__func__,s->cmnhdr);
+   DBG_PRINTF("%s: %s\n",__func__,s->mresdir);
+   DBG_PRINTF("%s: %s\n",__func__,s->merged);
+#else
+   LOG_STREAM << __func__ << ": " << s->mpath << endl;
+   LOG_STREAM << __func__ << ": " << s->mpathin << endl;
+   LOG_STREAM << __func__ << ": " << s->mpathproj << endl;
+   LOG_STREAM << __func__ << ": " << s->cmnhdr << endl;
+   LOG_STREAM << __func__ << ": " << s->mresdir << endl;
+   LOG_STREAM << __func__ << ": " << s->merged << endl;
+#endif
+}
+
+
+// path composition:
+// <mroot>/MERGE_<jobid>/{in,proj}
+void M4VL_merge_config_init(
+      const char * jobid,    // IN identifier to separate parallel merge requests
+      const char * mroot,    // IN dir where merge will create temporary working dirs
+      const char * mresdir,  // IN result merged file put to this dir
+      unsigned long dim,     // IN dimensionality of data (2D | 3D)
+      struct merge_config * s) // OUT
+{
+   strcpy(s->mroot,    mroot);
+   strcpy(s->mresdir,  mresdir);
+   s->dim = dim;
+
+   char mdir[MPATHLEN]; // name created from jobid
+   /* previous variant used timestamp and pid+tid:
+      sprintf(mdir, "MERGE_%s_%s", usec_timestamp(s->ts,TS_LEN), get_pid_tid(s->pidtidstr));
+    */
+   sprintf(mdir, "MERGE_%s", jobid);
+
+
+   sprintf(s->mpath,      "%s/%s", s->mroot,  mdir);
+
+   sprintf(s->mpathin,    "%s/%s", s->mpath,  "in");
+   sprintf(s->mpathproj,  "%s/%s", s->mpath,  "proj");
+
+
+   sprintf(s->cmnhdr,"%s/common.hdr",s->mpath);
+
+   //char merged[MPATHLEN + MFNAMELEN];
+   // strip terminating slash if exist
+   if( s->mresdir[strlen(s->mresdir)-1] == '/'){
+      sprintf(s->merged,"%svlkb-merged_%luD_%s.fits",s->mresdir,s->dim,jobid);
+      //sprintf(s->merged,"%svlkb-merged_%luD_%s_%s.fits",s->mresdir,s->dim,s->ts,s->pidtidstr);
+   }else{
+      sprintf(s->merged,"%s/vlkb-merged_%luD_%s.fits",s->mresdir,s->dim,jobid);
+      //sprintf(s->merged,"%s/vlkb-merged_%luD_%s_%s.fits",s->mresdir,s->dim,s->ts,s->pidtidstr);
+   }
+
+   merge_config_print(s);
+}
+
+
+// return 0 if common header successfully generated
+//        non-zero for errors
+int M4VL_mergefiles_common_header(
+      struct merge_config * s,               // IN  config (working dir & file names)
+      size_t nfiles, const char * fitsfiles[])  // IN  pathnames of FITS-files to merge
+{
+   LOG_trace(__func__);
+
+   int rc = 0;
+
+   if(mkdir(s->mpath, S_IRWXU)){
+      // FIXME threadsafe is strerror_r()
+      //DBG_PRINTF("%s: mkdir: %s for %s\n",__func__,strerror(errno),s->mpath);
+      LOG_STREAM << __func__ << ": mkdir: " << strerror(errno) << " for " << s->mpath << endl;
+      rc = -10;
+      return rc;
+   }
+   if(mkdir(s->mpathin, S_IRWXU)){
+      //DBG_PRINTF("%s: mkdir: %s for %s\n",__func__,strerror(errno),s->mpathin);
+      LOG_STREAM << __func__ << ": mkdir: " << strerror(errno) << " for " << s->mpathin << endl;
+      rc = -11;
+      cleanup_and_return(s->mpath,rc);
+   }
+   if(mkdir(s->mpathproj, S_IRWXU)){
+      //DBG_PRINTF("%s: mkdir: %s for %s\n",__func__,strerror(errno),s->mpathproj);
+      LOG_STREAM << __func__ << ": mkdir: " << strerror(errno) << " for " << s->mpathproj << endl;
+      rc = -12;
+      cleanup_and_return(s->mpath,rc);
+   }
+
+   // make a link of files to be merged into the merge in-dir
+   size_t i;
+   for (i=0;i<nfiles;i++){
+
+      const char * fn = fitsfiles[i];
+
+      //DBG_PRINTF("%s: symlink src : %s\n",__func__,fn);
+      LOG_STREAM << __func__ << ": symlink src: " << fn << endl;
+
+      char destpath[MPATHLEN + 4];  
+      // FIXME be consistent: use either sprintf or snprintf
+      char base_fn[2048 + 16];
+      strcpy(base_fn,fn);
+      snprintf (destpath, sizeof(destpath), "%s/%s", s->mpathin, basename(base_fn));
+
+      //DBG_PRINTF("%s: symlink dest: %s\n",__func__,destpath);
+      LOG_STREAM << __func__ << ": symlink dest: " << destpath << endl;
+
+      if(symlink(fn,destpath)) {
+         //DBG_PRINTF("%s: symlink: %s\n",__func__,strerror(errno));
+         LOG_STREAM << __func__ << ": symlink: " << strerror(errno) << endl;
+         rc = -20;
+         cleanup_and_return(s->mpath,rc);
+      }
+   }
+   printdir(s->mpath);
+   printdir(s->mpathin);
+
+   //
+   // 1 create common header
+   //
+#define MFNAMELEN NAME_MAX
+   char intbl[MPATHLEN + MFNAMELEN];
+   sprintf(intbl,"%s/inputs.tbl",s->mpath);
+
+   // current merge command
+   int mrc = 0;
+   char mcmd[MCMDLEN];
+   sprintf(mcmd,"mImgtbl %s %s",s->mpathin,intbl);
+   if((mrc = exec_montage_cmd(mcmd))) {
+      rc = -30;
+      cleanup_and_return(s->mpath,rc);
+   }
+   printdir(s->mpath);
+
+   sprintf(mcmd,"mMakeHdr %s %s %s",intbl,s->cmnhdr,"GAL");
+   if((mrc = exec_montage_cmd(mcmd))) {
+      rc = -31;
+      cleanup_and_return(s->mpath,rc);
+   }
+   printdir(s->mpath);
+   //    system("cat /tmp/MERGECTEST/common.hdr");
+
+   return rc;
+}
+
+
+
+// return 0 if success
+//        non-zero for errors
+int M4VL_mergefiles_reproject(
+      struct merge_config * s, // IN config (working dir & file names)
+      const char * fitsfile)   // IN  pathnames of FITS-files to merge
+{
+   LOG_trace(__func__);
+
+   int rc = 0;
+
+   char base_fn[MPATHLEN + MFNAMELEN];
+   strcpy(base_fn, fitsfile);
+
+   char inf[2*(MPATHLEN + MFNAMELEN)];
+   sprintf(inf,"%s/%s", s->mpathin, base_fn);
+
+   char projf[2*(MPATHLEN + MFNAMELEN)];
+   sprintf(projf,"%s/%s-proj", s->mpathproj, base_fn);
+
+   char mcmd[MCMDLEN + 512];
+   int mrc;
+
+   if(s->dim == 2) {
+      sprintf(mcmd,"mProjectQL %s %s %s", inf, projf, s->cmnhdr);
+   } else {
+      sprintf(mcmd,"mProjectCube %s %s %s", inf, projf, s->cmnhdr);
+   }
+
+   if((mrc = exec_montage_cmd(mcmd))) {
+      rc = -32;
+      cleanup_and_return(s->mpath,rc);
+   }
+
+   return rc;
+}
+
+
+// return 0 if success
+//        non-zero for errors
+int M4VL_mergefiles_add_reprojected(
+      struct merge_config * s,               // IN config (working dir & file names)
+      char * argmpath, size_t argmpath_maxlen)  // OUT pathname of merged FITS-file (up to max name length)
+{
+   LOG_trace(__func__);   
+
+   int rc = 0;
+
+   char projtbl[MPATHLEN + MFNAMELEN];
+   sprintf(projtbl,"%s/projected.tbl",s->mpath);
+
+   char mcmd[MCMDLEN];
+   int mrc;
+
+   sprintf(mcmd,"mImgtbl %s %s",s->mpathproj,projtbl);
+   if((mrc = exec_montage_cmd(mcmd))) {
+      rc = -33;
+      cleanup_and_return(s->mpath,rc);
+   }
+   printdir(s->mpath);
+
+   if(s->dim == 2){
+      // John Good/Montage: mAdd with -n if mProjectQL used
+      sprintf(mcmd,"mAdd -n -p %s %s %s %s",s->mpathproj,projtbl,s->cmnhdr,s->merged);
+   } else {
+      sprintf(mcmd,"mAddCube -p %s %s %s %s",s->mpathproj,projtbl,s->cmnhdr,s->merged);
+   }
+
+   if((mrc = exec_montage_cmd(mcmd))) {
+      rc = -34;
+      cleanup_and_return(s->mpath,rc);
+   }
+   printdir(s->mpath);
+
+   // FIXME this is terrible !!
+   size_t i;
+   for(i=0;i<argmpath_maxlen;i++) argmpath[i] = 0;
+   strncpy(argmpath,s->merged,argmpath_maxlen);
+
+   //DBG_PRINTF("%s: removing subtree...\n",__func__);
+   LOG_STREAM << __func__ << ": removing subtree..." << endl;
+   rmrf(s->mpath);
+   return rc;
+}
+
+
+// run split variant
+
+// return 0 if merge successful, merged file created
+//        non-zero for errors
+int M4VL_mergefiles_split(struct merge_files * p, size_t nfiles, char * fitsfiles[], char * argmerged, size_t argmerged_len)
+{
+   int rc = 0;
+
+   unsigned long dim = strtol(p->prefix,NULL,10);
+   // FIXME prefix was (mis)used to carry dimensionality
+
+   struct merge_config mconfig;
+
+   M4VL_merge_config_init("DUMMYJOBD",p->mroot, p->mresdir, dim, &mconfig);
+   merge_config_print(&mconfig);
+
+   rc = M4VL_mergefiles_common_header(&mconfig, nfiles, (const char**)fitsfiles);
+   if(rc) return rc; // common_header already removed the working subtree on failure
+
+   size_t i;
+   for( i = 0; i<nfiles; i++)
+   {
+      rc = M4VL_mergefiles_reproject(&mconfig, basename(fitsfiles[i]));
+      if(rc) cleanup_and_return(mconfig.mpath, rc);
+   }
+
+   rc = M4VL_mergefiles_add_reprojected(&mconfig, argmerged, argmerged_len);
+
+   return rc;
+}
+
+
diff --git a/data-access/engine/src/common/src/m4vl.hpp b/data-access/engine/src/common/src/m4vl.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..c568e8863661c96bc52fff46b8eb3fe17c8500bb
--- /dev/null
+++ b/data-access/engine/src/common/src/m4vl.hpp
@@ -0,0 +1,61 @@
+#ifndef M4VL_HPP
+#define M4VL_HPP
+
+#include <sys/param.h> // NAME_MAX PATH_MAX
+
+
+struct merge_files {
+    const char * mroot;      // dir where to create temporary working dir for merge
+    const char * mresdir;    // dir where to place resulting merged file
+    const char * prefix;     // merge filename prefix after "vlkb-merged_" supplied by client (optional)
+};
+
+//FIXME review pointers which should be const
+// fitsfiles - full path and filename
+int M4VL_mergefiles(struct merge_files * p, size_t nfiles, char * fitsfiles[], char * merged, size_t merged_len);
+
+int M4VL_mergefiles_split(struct merge_files * p, size_t nfiles, char * fitsfiles[], char * merged, size_t merged_len);
+
+#define TS_LEN (256)
+#define MPATHLEN PATH_MAX
+#define MFNAMELEN NAME_MAX
+struct merge_config
+{
+   unsigned long dim;
+   char ts[TS_LEN];
+   char pidtidstr[NAME_MAX];
+
+   char mroot[MPATHLEN];
+   char mpath[MPATHLEN];
+   char mpathin[MPATHLEN];
+   char mpathproj[MPATHLEN];
+   char cmnhdr[MPATHLEN + MFNAMELEN];
+   char prefix[MPATHLEN]; // FIXME prefix was misused for dim: "2" "3" -> (2D 3D)
+   char mresdir[MPATHLEN];
+   char merged[MPATHLEN + MFNAMELEN];
+};
+
+void M4VL_merge_config_init(
+                const char * jobid,    // IN identifier to separate parallel merge requests
+                const char * mroot,    // IN dir where merge will create temporary working dirs
+                const char * mresdir,  // IN result merged file put to this dir
+                unsigned long dim,     // IN dimensionality of data (2D | 3D)
+                struct merge_config * s); // OUT
+
+
+
+int M4VL_mergefiles_common_header(
+		struct merge_config * s, 	// IN OUT internal state of merge
+		size_t nfiles, const char * fitsfiles[]);// IN pathnames to FITS-files to merge
+
+int M4VL_mergefiles_reproject(
+                struct merge_config * s, // IN OUT internal state of merge
+                const char * fitsfile);  // IN  pathname of one FITS-file to re-project
+
+int M4VL_mergefiles_add_reprojected(
+                struct merge_config * s,                 // IN OUT internal state of merge
+                char * argmpath, size_t argmpath_maxlen);  // OUT pathname of merged FITS-file (up to max name length)
+
+
+#endif
+
diff --git a/data-access/engine/src/common/src/mcutout.cpp b/data-access/engine/src/common/src/mcutout.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..4fd9024a0a7d04db43ab551073a17373c4c7ce45
--- /dev/null
+++ b/data-access/engine/src/common/src/mcutout.cpp
@@ -0,0 +1,155 @@
+
+#include "cutout.hpp"
+#include "mcutout.hpp"
+#include "fitsfiles.hpp"
+#include "fits_header.hpp"
+#include "ast4vl.hpp"
+#include "mcutout_nljson.hpp"
+#include "mcutout_ostream.hpp"
+#include "json.hpp"
+#include "io.hpp"
+#include "my_assert.hpp"
+
+#include <string.h>
+#include <string>
+#include <fstream> // ofstream for tgz-file
+#include <sstream>
+#include <stdexcept>
+// for timestamp
+#include <iomanip>
+#include <chrono>
+#include <ctime>
+
+/* create_timestamp */
+#include <time.h>
+#include <sys/time.h>
+
+
+
+using namespace std;
+using json = nlohmann::json;
+
+
+
+// mcutout
+
+
+
+string exec_targz(std::vector<cut_resp_s> resps, string targzdir) // FIXME find suitable libs and replace cmd-exec
+{
+   LOG_trace(__func__);
+   /* FIXME error check missing */
+
+   /* encode responses to json and add to tgz file */
+
+   json jjrr;
+   try
+   {
+      jjrr = resps;
+   }
+   catch(json::exception& e)
+   {
+      LOG_STREAM << "mcutout_json() resp : " << e.what() << std::endl;
+   }
+   string resps_json_str =  jjrr.dump();
+
+   /* add json string as file inside tar.gz */
+   const string resps_json_name = "response.json";
+   {
+      const string resps_json_pathname = targzdir + "/" + resps_json_name;
+      std::ofstream file_id;
+      file_id.open(resps_json_pathname);
+      file_id << resps_json_str;
+      file_id.close();
+   }
+   /* check tar and gzip are installed */
+
+   const string timestamp{create_timestamp()};
+
+   const string tgz_filename = "mcutout_" + timestamp + ".tar.gz";
+   const string tgzname = targzdir + "/" + tgz_filename;
+   string cmd = "tar cfz " + tgzname;
+   cmd.append(" -C " + targzdir);
+   cmd.append(" " + resps_json_name);
+   for(auto resp : resps)
+   {
+      if(resp.type == content_type::FILENAME)
+         cmd.append(" " + resp.content);
+   }
+
+   LOG_STREAM << "compress: " << cmd << endl;
+   system(cmd.c_str());
+
+   return tgz_filename;
+}
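+
+// Resulting archive layout, as assembled by the tar command above:
+//
+//   mcutout_<timestamp>.tar.gz
+//     response.json        -- JSON-encoded vector<cut_resp_s>
+//     <cut FITS files>     -- one per response with type FILENAME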
+
+
+struct cut_resp_s one_cutout(const struct cut_param_s cut, const string fits_path, const string fits_cut_path)
+{
+   LOG_trace(__func__);
+
+   LOG_STREAM << cut << endl;
+
+   struct cut_resp_s resp;
+   try
+   {
+      //uintmax_t filesize;
+      const string relative_pathname = (cut.filename); // FIXME wrong if the configured value is an http://... URL
+      const string abs_fits_pathname = fits_path + '/' + relative_pathname;
+      const string cutfitsname = generate_cut_fitsname(relative_pathname, cut.hdunum);
+      /*filesize =*/ cutout_file(
+            abs_fits_pathname, cut.hdunum,
+            cut.coord,
+            cutfitsname,
+            cut.cards);
+
+      if(cut.countNullVals)
+      {
+         unsigned long long null_cnt;
+         unsigned long long total_cnt;
+         double fill_ratio  = fitsfiles::calc_nullvals(fits_cut_path + "/" + cutfitsname, 1/*hdunum*/,
+               null_cnt, total_cnt);
+         LOG_STREAM << "nullvals: " << fill_ratio << " : " << null_cnt << " : " << total_cnt << endl;
+      }
+
+      resp.type = content_type::FILENAME;
+      resp.content = cutfitsname;
+   }
+   catch (std::invalid_argument const& e)
+   {
+      resp.type = content_type::BAD_REQUEST;
+      resp.content = string("Invalid argument error: ").append(e.what());
+   }
+   catch (std::exception const& e)
+   {
+      resp.type = content_type::SERVICE_ERROR;
+      resp.content = string("System error: ").append(e.what());
+   }
+
+   return resp;
+}
+
+
+
+struct mcutout_res_s mcutout(vector<struct cut_param_s> cut_params, const string fits_path, const string fits_cut_path)
+{
+   LOG_trace(__func__);
+
+   std::vector<cut_resp_s> resps(cut_params.size());
+
+   unsigned int i = 0;
+   for(auto cut : cut_params)
+   {
+      cut_resp_s resp;
+      resp = one_cutout(cut, fits_path, fits_cut_path);
+      resp.input = cut;
+      resps[i++] = resp;
+   }
+
+   string tgzfilename = exec_targz(resps, fits_cut_path);
+
+   struct mcutout_res_s mresult{0,tgzfilename, resps};
+
+   return mresult;
+}
+
diff --git a/data-access/engine/src/common/src/mcutout_nljson.cpp b/data-access/engine/src/common/src/mcutout_nljson.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..78f0f753ee3403d935d3eb8e9f756c483ae100bd
--- /dev/null
+++ b/data-access/engine/src/common/src/mcutout_nljson.cpp
@@ -0,0 +1,82 @@
+
+#include "json.hpp"
+#include "io.hpp"
+#include "my_assert.hpp"
+#include "mcutout.hpp"
+#include "cutout_nljson.hpp" // needs json<->coordinates conversions
+#include "mcutout_nljson.hpp"
+
+#include <string>
+
+using namespace std;
+
+using json = nlohmann::json;
+
+NLOHMANN_JSON_SERIALIZE_ENUM( content_type, {
+      {content_type::FILENAME, "FILENAME"},
+      {content_type::BAD_REQUEST, "BAD_REQUEST"},
+      {content_type::SERVICE_ERROR, "SERVICE_ERROR"}});
+
+
+void to_json(json& j, const cut_param_s& p)
+{
+   j = json{
+      {"pubdid", p.pubdid},
+         {"coord", p.coord},
+         {"countNullVals", p.countNullVals},
+         {"filename", p.filename},
+         {"hdunum", p.hdunum},
+         {"extra_cards", p.cards}
+   };
+}
+
+
+void from_json(const json& j, cut_param_s& p) 
+{
+   j.at("pubdid").get_to(p.pubdid);
+
+   const json& jw = j.at("coord");
+   struct coordinates temp_coord;
+   from_json(jw, temp_coord);
+   p.coord = temp_coord;
+
+   if(j.contains("countNullVals"))
+      j.at("countNullVals").get_to(p.countNullVals);
+   else
+      p.countNullVals = false;
+
+   j.at("filename").get_to(p.filename);
+   j.at("hdunum").get_to(p.hdunum);
+
+   if(j.contains("extra_cards"))
+   {
+      json jcards = j.at("extra_cards");
+      p.cards = jcards.get<std::vector<struct fits_card>>();
+   }
+}
+
+
+
+void to_json(json& j, const mcutout_res_s& p)
+{
+   j = json{
+      {"filesize", p.filesize},
+         {"filename", p.tgz_filename},
+         {"responses", p.responses}  // arr of cut_resp_s
+   };
+}
+
+
+void to_json(json& j, const cut_resp_s& p)
+{
+   j = json{ {"input", p.input}, {"type", p.type}, {"content", p.content} };
+}
+
+void from_json(const json& j, cut_resp_s& p)
+{
+   j.at("input").get_to(p.input);  // cut_param_s
+   j.at("type").get_to(p.type);
+   j.at("content").get_to(p.content);
+}
+
+
diff --git a/data-access/engine/src/common/src/mcutout_ostream.cpp b/data-access/engine/src/common/src/mcutout_ostream.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..40400af02eb278360528508e13b357b508d5e45a
--- /dev/null
+++ b/data-access/engine/src/common/src/mcutout_ostream.cpp
@@ -0,0 +1,42 @@
+
+#include "cutout_ostream.hpp" // coordinates needed
+#include "mcutout_ostream.hpp"
+#include "cutout.hpp"
+#include "mcutout.hpp"
+#include "my_assert.hpp"
+
+#include <iomanip> // setw
+#include <iostream>
+#include <vector>
+
+using namespace std;
+
+
+std::string to_string(content_type ss)
+{
+   string str;
+   switch(ss)
+   {
+      case content_type::FILENAME: str = "FILENAME"; break;
+      case content_type::BAD_REQUEST: str = "BAD_REQUEST"; break;
+      case content_type::SERVICE_ERROR: str = "SERVICE_ERROR"; break;
+   }
+   my_assert(!str.empty(), __FILE__,__LINE__, "unrecognized value of content_type");
+   return str;
+}
+
+
+std::ostream& operator<<( std::ostream &out, struct ::cut_param_s const& p)
+{
+   out << std::setw(20) << p.pubdid << " " << p.coord << " " << std::boolalpha << p.countNullVals
+      << " " << p.filename << " " << p.hdunum << " extra_cards[" << p.cards.size() << "]";
+   return out;
+}
+
+std::ostream& operator<<( std::ostream &out, struct cut_resp_s const& p)
+{
+   out << std::setw(20) << to_string(p.type) << " " << p.content << " for input: " << p.input;
+   return out;
+}
+
+
diff --git a/data-access/engine/src/common/src/mergefiles.cpp b/data-access/engine/src/common/src/mergefiles.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..3c987fab79fb7665e566258fad037c781c9518af
--- /dev/null
+++ b/data-access/engine/src/common/src/mergefiles.cpp
@@ -0,0 +1,219 @@
+
+#include "mcutout.hpp"
+
+#include "m4vl.hpp"
+
+#include "fitsfiles.hpp"
+#include "io.hpp"
+
+#include <linux/limits.h> // PATH_MAX NAME_MAX
+#include <cstring>
+
+#include <iterator>
+#include <sstream>
+#include <iostream>
+#include <stdexcept>
+#include <string>
+
+// for timestamp
+#include <iomanip>
+#include <chrono>
+#include <ctime>
+
+#include <sys/stat.h> // for filesize
+
+using namespace std;
+
+
+string error_msg(const char * file, int line, string msg)
+{
+   return string{file} + ":" + to_string(line) + " " + msg;
+}
+
+
+
+unsigned long mergefilesize(const char * pathname)
+{
+   struct stat st; 
+   if(stat(pathname, &st))
+   {   
+      return 0;
+   };  
+   return st.st_size;
+}
+
+
+char** malloc_fitsfiles_array(const vector<string> fitsfiles, char  * arg_c_fitsfiles[])
+{
+   arg_c_fitsfiles = (char**)malloc(fitsfiles.size() * sizeof(char*));
+
+   size_t i;
+   for(i = 0; i < fitsfiles.size(); i++)
+   {
+      // +1 for the terminating NUL byte
+      arg_c_fitsfiles[i] = (char*)malloc((fitsfiles.at(i).length() + 1) * sizeof(char));
+      strcpy(arg_c_fitsfiles[i], fitsfiles.at(i).c_str());
+   }
+   return arg_c_fitsfiles;
+}
+
+void free_fitsfiles_array(const size_t size, char  * arg_c_fitsfiles[])
+{
+   size_t i;
+   for(i=0; i<size;i++)
+   {
+      free(arg_c_fitsfiles[i]);
+   }
+
+   free(arg_c_fitsfiles);
+}
+
+
+#define STACKSTRLEN (MAXPATHLEN + 16 + 512)
+
+
+unsigned long xmergefiles(
+      const vector<string> fitsfiles,
+      const string dimensionality,
+      const string merge_dir,
+      const string result_dir,
+      string& merged_file_pathname)
+{
+   LOG_trace(__func__);
+
+   char ** c_fitsfiles = NULL;
+   c_fitsfiles = malloc_fitsfiles_array(fitsfiles, c_fitsfiles);
+
+   const char * mroot = merge_dir.c_str();
+   const char * mergedpath = result_dir.c_str();
+   const char * prefix = dimensionality.c_str();
+   size_t nfiles = fitsfiles.size();
+
+   struct merge_files pm = {mroot,mergedpath,prefix};
+   const size_t merged_len = PATH_MAX + NAME_MAX;
+   char merged[merged_len]; // FIXME const char* []
+
+   int rc = M4VL_mergefiles(&pm, nfiles, (char**) c_fitsfiles, merged, merged_len);
+   if(rc != 0)
+   {
+      throw runtime_error( error_msg(__FILE__,__LINE__, "M4VL_mergefiles() failed with rc=" + to_string(rc)) );
+   }
+
+   free_fitsfiles_array(fitsfiles.size(), c_fitsfiles);
+
+   merged_file_pathname = merged;
+   unsigned long mergefsize = mergefilesize(merged);
+   return mergefsize;
+}
+
+
+// mergefiles split (allows the reproject step to run in parallel)
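+//
+// The caller drives the three steps itself; reprojection may run in
+// parallel, one call per input file (sketch):
+//
+//   xmergefiles_common_header(id, files, dim, mdir, rdir);
+//   for each f in files: xmergefiles_reproject(id, basename(f), dim, mdir, rdir);
+//   xmergefiles_add_reprojected(id, dim, mdir, rdir, merged_pathname);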
+
+
+void xmergefiles_common_header(
+      const string merge_id,
+      const vector<string> fitsfiles,
+      const string dimensionality,
+      const string merge_dir,
+      const string result_dir)
+{
+   LOG_trace(__func__);
+
+   char ** c_fitsfiles = NULL;
+   c_fitsfiles = malloc_fitsfiles_array(fitsfiles, c_fitsfiles); // assign: the function cannot modify the caller's pointer through its parameter
+
+   const char * jobid = merge_id.c_str();
+   const char * mroot = merge_dir.c_str();
+   const char * mergedpath = result_dir.c_str();
+   const char * prefix = dimensionality.c_str();
+   unsigned long dim = strtol(prefix, NULL, 10);// FIXME prefix was (mis)used to carry dimensionality of FITS files
+   size_t nfiles = fitsfiles.size();
+
+   struct merge_config mconf;
+   M4VL_merge_config_init(jobid, mroot, mergedpath, dim, &mconf);
+
+   int rc = M4VL_mergefiles_common_header(&mconf, nfiles, (const char**) c_fitsfiles);
+
+   free_fitsfiles_array(fitsfiles.size(), c_fitsfiles);
+
+   if(rc == 0)
+   {
+      LOG_STREAM << "M4VL_mergefiles_common_header() succeeded for job: " + string{jobid} << endl;
+   }
+   else
+   {
+      throw runtime_error(
+            error_msg(__FILE__,__LINE__, "M4VL_mergefiles_common_header() failed for job: "
+               + string{jobid}  + " with rc=" + to_string(rc)) );
+   }
+}
+
+
+void xmergefiles_reproject(
+      const string merge_id,
+      const string fitsfilename,
+      const string dimensionality,
+      const string merge_dir,
+      const string result_dir)
+{
+   LOG_trace(__func__);
+
+   const char * jobid = merge_id.c_str();
+   const char * mroot = merge_dir.c_str();
+   const char * mergedpath = result_dir.c_str();
+   const char * prefix = dimensionality.c_str();
+   unsigned long dim = strtol(prefix, NULL, 10);// FIXME prefix was (mis)used to carry dimensionality of FITS files
+
+   struct merge_config mconf;
+   M4VL_merge_config_init(jobid, mroot, mergedpath, dim, &mconf);
+
+   int rc = M4VL_mergefiles_reproject(&mconf, fitsfilename.c_str());
+   if(rc == 0)
+   {
+      LOG_STREAM << "M4VL_mergefiles_reproject() succeeded for job: " + string{jobid} << endl;
+   }
+   else
+   {
+      throw runtime_error(
+            error_msg(__FILE__,__LINE__, "M4VL_mergefiles_reproject() failed for job: "
+               + string{jobid}  + " with rc=" + to_string(rc)) );
+   }
+}
+
+
+
+unsigned long xmergefiles_add_reprojected(
+      const string merge_id,
+      const string dimensionality,
+      const string merge_dir,
+      const string result_dir,
+      string& merged_file_pathname)
+{
+   LOG_trace(__func__);
+
+   const size_t merged_len = PATH_MAX + NAME_MAX;
+   char merged[merged_len];
+
+   const char * jobid = merge_id.c_str();
+   const char * mroot = merge_dir.c_str();
+   const char * mergedpath = result_dir.c_str();
+   const char * prefix = dimensionality.c_str();
+   unsigned long dim = strtol(prefix, NULL, 10);// FIXME prefix was (mis)used to carry dimensionality of FITS files
+
+   struct merge_config mconf;
+   M4VL_merge_config_init(jobid, mroot, mergedpath, dim, &mconf);
+
+   int rc = M4VL_mergefiles_add_reprojected(&mconf, merged, merged_len);
+   if(rc != 0)
+   {
+      throw runtime_error(
+            error_msg(__FILE__,__LINE__, "M4VL_mergefiles_add_reprojected() failed for job: "
+               + string{jobid}  + " with rc=" + to_string(rc)) );
+   }
+
+   merged_file_pathname = merged;
+   unsigned long mergefsize = mergefilesize(merged);
+   return mergefsize;
+}
+
+
diff --git a/data-access/engine/src/common/src/my_assert.cpp b/data-access/engine/src/common/src/my_assert.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..fb577b35755c2eb9658ba1ae16b02887c9d45cfc
--- /dev/null
+++ b/data-access/engine/src/common/src/my_assert.cpp
@@ -0,0 +1,14 @@
+#include "my_assert.hpp"
+
+#include <string>
+#include <stdexcept>
+
+void my_assert(bool statement, std::string src_filename, int line_no, std::string msg)
+{
+   if(!statement)
+      throw std::runtime_error(std::string{src_filename} + ":" + std::to_string(line_no) + ": " + msg);
+}
+
+
diff --git a/data-access/engine/src/vlkb-obscore/Makefile b/data-access/engine/src/vlkb-obscore/Makefile
new file mode 100644
index 0000000000000000000000000000000000000000..63bc195bc03b2fa9f58363b4ee9c92e1da2fbf6f
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/Makefile
@@ -0,0 +1,183 @@
+#================================================================================
+EXEC_NAME=vlkb-obscore
+INST_NAME=test
+DEBUG_LEV=-v1
+INSTALL_DIR=/usr/local
+VERSION ?= $(shell git describe)
+TAR_NAME := `basename $(PWD)`
+#================================================================================
+DEPS_DIR := ../common ../../ext/aria-csv ../../ext/nlohmann-json
+DEPS_INC := $(foreach d, $(DEPS_DIR), $d/include)
+DEPS_LIB := $(foreach d, $(DEPS_DIR), $d/lib)
+#================================================================================
+INC_DIR = $(DEPS_INC) \
+	  /usr/include/cfitsio \
+	  /usr/local/cfitsio/include
+LIB_DIR = $(DEPS_LIB) \
+	  /usr/lib64/ast \
+	  /usr/local/lib \
+	  /usr/local/cfitsio/lib
+#================================================================================
+CC=g++
+CFLAGS_DEBUG   = -g -DFDB_DEBUG
+CFLAGS_RELEASE = -O2 
+FLAGS_COMMON   = -fPIC -Wall -Wextra -Wconversion -fno-common -pthread -DVERSIONSTR='"$(VERSION)"' \
+-DBUILD='"$(shell LANG=us_US date; hostname)"'
+CFLAGS_COMMON   = -c -Wstrict-prototypes $(FLAGS_COMMON)
+CXX_DEBUG_FLAGS   = -g -DVERBOSE_DEBUG -DFDB_DEBUG
+CXX_RELEASE_FLAGS = -O3
+# libpqxx 7.7 needs: /usr/include/pqxx/array.hxx:85:3: note: ‘std::string_view’ is only available from C++17 onwards
+CXX_DEFAULT_FLAGS = -c -std=c++17 $(FLAGS_COMMON)
+LDFLAGS = -Wall -lvlkbcommon -lcfitsio -lpq -lpqxx -lcsv -last -last_grf_2.0 -last_grf_3.2 -last_grf_5.6 -last_grf3d -last_err -lstdc++ -lm  
+INC_PARM=$(foreach d, $(INC_DIR), -I$d)
+LIB_PARM=$(foreach d, $(LIB_DIR), -L$d)
+#================================================================================
+EXT_DIR = ext
+SRC_DIR = src
+OBJ_DIR = obj
+BIN_DIR = bin
+DB_SRC_DIR=$(SRC_DIR)/database
+DB_OBJ_DIR=$(OBJ_DIR)/database
+INC_PARM += -I$(EXT_DIR)/include -I$(SRC_DIR)
+LIB_PARM += -L$(EXT_DIR)/lib
+#================================================================================
+EXECUTABLE = $(BIN_DIR)/$(EXEC_NAME)
+DB_CPP_FILES  = $(wildcard $(SRC_DIR)/database/*.cpp)
+DB_OBJ_FILES  = $(addprefix $(DB_OBJ_DIR)/,$(notdir $(DB_CPP_FILES:.cpp=.o))) 
+CPP_FILES  = $(wildcard $(SRC_DIR)/*.cpp)
+OBJ_FILES  = $(addprefix $(OBJ_DIR)/,$(notdir $(CPP_FILES:.cpp=.o))) 
+#================================================================================
+NPROCS = $(shell grep -c 'processor' /proc/cpuinfo)
+MAKEFLAGS += -j$(NPROCS)
+#================================================================================
+.PHONY: all run debug release clean
+
+all: debug 
+
+.PHONY: run 
+run: debug
+	$(EXECUTABLE) $(INST_NAME) $(DEBUG_LEV)
+
+release: CFLAGS   += $(CFLAGS_RELEASE) $(CFLAGS_COMMON)
+release: CXXFLAGS += $(CXX_RELEASE_FLAGS) $(CXX_DEFAULT_FLAGS)
+release: $(EXECUTABLE)
+
+debug: CFLAGS   += $(CFLAGS_DEBUG) $(CFLAGS_COMMON)
+debug: CXXFLAGS += $(CXX_DEBUG_FLAGS) $(CXX_DEFAULT_FLAGS)
+debug: $(EXECUTABLE)
+
+$(EXECUTABLE): makedir $(OBJ_FILES) $(DB_OBJ_FILES)
+	$(CC) $(DB_OBJ_FILES) $(OBJ_FILES) $(LIB_PARM) $(LDFLAGS) -o $@
+
+$(DB_OBJ_DIR)/%.o: $(DB_SRC_DIR)/%.cpp
+	$(CC) $(CXXFLAGS) $(INC_PARM) -o $@ $<
+
+$(OBJ_DIR)/%.o: $(SRC_DIR)/%.cpp
+	$(CC) $(CXXFLAGS) $(INC_PARM) -o $@ $<
+
+clean:
+	-rm -rf $(OBJ_DIR) $(BIN_DIR) $(EXT_DIR)
+
+.PHONY: echo
+echo:
+	@echo EXECUTABLE:
+	@echo $(EXECUTABLE)
+	@echo CPP FILES:
+	@echo $(CPP_FILES)
+	@echo OBJ_FILES:
+	@echo $(OBJ_FILES)
+	@echo DB_OBJ_FILES:
+	@echo $(DB_OBJ_FILES)
+	@echo INC_PARM
+	@echo $(INC_PARM)
+	@echo LIB_PARM
+	@echo $(LIB_PARM)
+	@echo installedEXE
+	@echo $(INSTALL_DIR)/$(EXEC_NAME)$(SUFFIX)
+
+
+
+# release tar.gz
+
+.PHONY: $(DEPS_DIR)
+$(DEPS_DIR):
+	make -C $@ $(DEPS_TARGET)
+
+.PHONY: deps
+deps : $(DEPS_DIR)
+	mkdir -p $(EXT_DIR)
+	cp -r $(DEPS_INC) $(EXT_DIR)
+	cp -r $(DEPS_LIB) $(EXT_DIR)
+
+.PHONY: deps-clean
+deps-clean : DEPS_TARGET=clean
+deps-clean : $(DEPS_DIR)
+
+.PHONY: makedir
+makedir:
+	-mkdir -p $(OBJ_DIR) $(OBJ_DIR)/database $(BIN_DIR)
+
+.PHONY: tar
+tar: deps
+	-tar -czvf $(TAR_NAME)-$(VERSION).tar.gz --transform="s|^|$(TAR_NAME)-$(VERSION)/|" $(PROTO_DIR) $(SRC_DIR) $(EXT_DIR) Makefile
+
+
+
+# release rpm deb
+
+.PHONY: rpm
+rpm: RPM_ROOT=rpmbuild
+rpm: release
+	mkdir -p $(RPM_ROOT)/{BUILD,BUILDROOT,RPMS,SOURCES,SPECS,SRPMS}
+	cp $(EXEC_NAME).spec $(RPM_ROOT)/SPECS
+	cp bin/$(EXEC_NAME) $(RPM_ROOT)/SOURCES
+	rpmbuild -bb --define "_topdir `pwd`/$(RPM_ROOT)"  --define "_prefix /usr/local" --define "version $(shell git describe | sed -r 's/-/./g')"  $(EXEC_NAME).spec
+	find $(RPM_ROOT)/RPMS/* -name '*.rpm' -print0 | xargs -0 cp -t .
+	rm -fr $(RPM_ROOT)
+
+
+.PHONY: deb
+deb: DEB_ROOT=debbuild
+deb: PREFIX=$(DEB_ROOT)/$(EXEC_NAME)/usr/local
+deb:
+	mkdir -p $(DEB_ROOT)/$(EXEC_NAME)/DEBIAN $(PREFIX)
+	mkdir -p $(PREFIX)/bin $(PREFIX)/etc/$(EXEC_NAME)
+	mkdir -p $(PREFIX)/share/doc/$(EXEC_NAME)
+	mkdir -p $(PREFIX)/share/man/man1
+	sed 's/Version:.*/Version: $(VERSION)/' $(EXEC_NAME).control > $(DEB_ROOT)/$(EXEC_NAME)/DEBIAN/control
+	echo "/usr/local/etc/$(EXEC_NAME)/datasets.conf" > $(DEB_ROOT)/$(EXEC_NAME)/DEBIAN/conffiles
+	cp bin/$(EXEC_NAME) $(PREFIX)/bin
+	cp $(EXEC_NAME).datasets.conf $(PREFIX)/etc/$(EXEC_NAME)/datasets.conf
+	cp $(EXEC_NAME).changelog.Debian $(PREFIX)/share/doc/$(EXEC_NAME)/changelog.Debian
+	cp $(EXEC_NAME).copyright $(PREFIX)/share/doc/$(EXEC_NAME)/copyright
+	cp $(EXEC_NAME).1 $(PREFIX)/share/man/man1/$(EXEC_NAME).1
+	gzip --best -n $(PREFIX)/share/man/man1/$(EXEC_NAME).1
+	gzip --best -n $(PREFIX)/share/doc/$(EXEC_NAME)/changelog.Debian
+	cd $(DEB_ROOT) && dpkg-deb --root-owner-group --build $(EXEC_NAME) && mv $(EXEC_NAME).deb ../$(EXEC_NAME)_$(VERSION).deb && cd -
+	rm -fr $(DEB_ROOT)
+
+
+# gitlab Packages doc: https://docs.gitlab.com/ee/user/packages/generic_packages/
+# make up/download PACK_EXT = rpm | deb
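+# Example invocations (illustrative; VERSION defaults to `git describe`):
+#   make upload PACK_EXT=deb
+#   make download VERSION=1.2-3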
+.PHONY: upload
+upload: PACK_FILE := $(shell ls -t $(EXEC_NAME)*.$(PACK_EXT) | head -1)
+upload: GITLAB_PROJ_ID := 79
+upload: GITLAB_PROJ_NAME := $(shell basename -s .git `git config --get remote.origin.url`)
+upload: VER_MAJOR := $(shell echo $(VERSION) | cut -f1 -d.)
+upload: VER_MINOR := $(shell echo $(VERSION) | cut -f2 -d.)
+upload: PACK_URL := "https://ict.inaf.it/gitlab/api/v4/projects/$(GITLAB_PROJ_ID)/packages/generic/$(GITLAB_PROJ_NAME)/$(VER_MAJOR).$(VER_MINOR)/$(PACK_FILE)"
+upload:
+	curl --header "PRIVATE-TOKEN: glpat-CJZDcks7bYqE__ePn4J6" --upload-file $(PACK_FILE) $(PACK_URL)
+
+
+.PHONY: download
+#download: PACK_FILE := $(EXEC_NAME)-$(shell echo $(VERSION) | sed -r "s/-/./g ")-1.x86_64.rpm
+download: PACK_FILE := $(EXEC_NAME)_$(VERSION).deb
+download: GITLAB_PROJ_ID := 79
+download: GITLAB_PROJ_NAME := $(shell basename -s .git `git config --get remote.origin.url`)
+download: VER_MAJOR := $(shell echo $(VERSION) | cut -f1 -d.)
+download: VER_MINOR := $(shell echo $(VERSION) | cut -f2 -d.)
+download: PACK_URL := "https://ict.inaf.it/gitlab/api/v4/projects/$(GITLAB_PROJ_ID)/packages/generic/$(GITLAB_PROJ_NAME)/$(VER_MAJOR).$(VER_MINOR)/$(PACK_FILE)"
+download:
+	curl -O --header "PRIVATE-TOKEN: glpat-CJZDcks7bYqE__ePn4J6" $(PACK_URL)
+
diff --git a/data-access/engine/src/vlkb-obscore/src/Obsolete/addcards.cpp b/data-access/engine/src/vlkb-obscore/src/Obsolete/addcards.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..a4e4507a488a58a46bec142b9a84af6a8c165e57
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/Obsolete/addcards.cpp
@@ -0,0 +1,48 @@
+
+#include "addcards.hpp"
+
+#include "extra_cards.hpp"
+#include "database/DbConn.hpp" // FIXME
+#include "fitsfiles.hpp"
+#include "io.hpp"
+
+#include <fstream>
+
+using namespace std;
+
+
+struct cards query_cards(const string db_uri, const string db_schema, unsigned int survey_id)
+{
+   LOG_trace(__func__);
+
+   DbConn db(db_uri, db_schema);
+
+   Survey surv = db.querySurveyAttributes(survey_id);
+
+   struct cards values
+   {
+      surv.restFrequency,
+         to_velocity_unit(surv.velocityFitsUnit)
+   };
+
+   return values;
+}
+
+
+
+void vlkb_addcards_by_surveyid(config conf, unsigned int survey_id, const std::string pathname, unsigned int hdunum)
+{
+   LOG_trace(__func__);
+
+   const bool WITH_PASSWORD = true;
+
+   string db_uri    = conf.getDbUri(WITH_PASSWORD);
+   string db_schema = conf.getDbSchema();
+
+   struct cards card_values = query_cards(db_uri, db_schema, survey_id);
+   vector<struct fits_card> cards = convert_to_cards(card_values);
+
+   fitsfiles::add_cards_if_missing(pathname, hdunum, cards); 
+}
+
+
diff --git a/data-access/engine/src/vlkb-obscore/src/Obsolete/addcards.hpp b/data-access/engine/src/vlkb-obscore/src/Obsolete/addcards.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..40ec380dce02d25fb8f949c57910158e4928f1b1
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/Obsolete/addcards.hpp
@@ -0,0 +1,9 @@
+#ifndef ADDCARDS_HPP
+#define ADDCARDS_HPP
+
+#include "config.hpp"
+#include <string>
+
+void vlkb_addcards_by_surveyid(config conf, unsigned int survey_id, const std::string pathname, unsigned int hdunum = 1);
+
+#endif
diff --git a/data-access/engine/src/vlkb-obscore/src/Obsolete/extra_cards.cpp b/data-access/engine/src/vlkb-obscore/src/Obsolete/extra_cards.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..9b8109583b4c968a7985ec8669193906546c25be
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/Obsolete/extra_cards.cpp
@@ -0,0 +1,53 @@
+
+#include "extra_cards.hpp"
+#include "cutout.hpp" // struct fits_card needed
+
+#include "io.hpp"
+
+#include <sstream>
+#include <iomanip> // setprecision()
+
+using namespace std;
+
+vector<struct fits_card> convert_to_cards(const struct cards card_values)
+{
+   LOG_trace(__func__);
+
+   const string empty_str;
+   string rest_frq_str;
+
+   // E. W. Greisen et al.: Representations of spectral coordinates in FITS
+   // paper III, section 3.4.4 Coordinate parameters
+
+   // the rest frequency or wavelength of the spectral-feature :
+   // the special floating-valued keywords
+   // RESTFRQa (floating-valued),
+   // RESTWAVa (floating-valued),
+   // Their units are ‘Hz’ and ‘m’
+   //
+   // Keyword RESTFREQ has been used in previous FITS files
+   // and should be recognized as equivalent to RESTFRQ.
+   //
+   // FIXME what is unit of RESTFREQ key ?
+
+   vector<struct fits_card> new_cards;
+
+   if(card_values.cunit3 == velocity_unit::NONE) return new_cards;
+
+   ostringstream ss;
+   ss << std::fixed << std::setprecision(1) << card_values.rest_frq;
+   rest_frq_str = ss.str();
+
+   string cunit3_str = to_string(card_values.cunit3);
+
+   struct fits_card card_restfrq{"RESTFRQ", rest_frq_str, "[Hz] key added by vlkb_cutout"};
+   struct fits_card card_cunit3 {"CUNIT3",  "'" + cunit3_str + "'",   "key added by vlkb_cutout"};
+
+   // FIXME we (mis)use 0.0 as value when DB entry is empty (2D images)
+   // see db-query default value: row[rest_frequency].as<double>(0.0)
+   if(card_values.rest_frq > 0.0) new_cards.push_back(card_restfrq);
+   if(!cunit3_str.empty()) new_cards.push_back(card_cunit3);
+
+   return new_cards;
+}
+
diff --git a/data-access/engine/src/vlkb-obscore/src/Obsolete/extra_cards.hpp b/data-access/engine/src/vlkb-obscore/src/Obsolete/extra_cards.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..91f8683c5c648e2881829c1795b282d6e217c068
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/Obsolete/extra_cards.hpp
@@ -0,0 +1,21 @@
+#ifndef EXTRA_CARDS_HPP
+#define EXTRA_CARDS_HPP
+
+#include "parse_surveys_csv.hpp" // velocity_unit and conversions needed
+#include "cutout.hpp" // struct fits_card needed
+
+#include <string>
+#include <vector>
+
+
+struct cards
+{
+   double rest_frq;
+   velocity_unit cunit3;
+};
+
+struct cards query_cards(const std::string db_uri, const std::string db_schema, unsigned int survey_id);
+
+std::vector<struct fits_card> convert_to_cards(const struct cards card_values);
+
+#endif
diff --git a/data-access/engine/src/vlkb-obscore/src/Obsolete/service_string.cpp b/data-access/engine/src/vlkb-obscore/src/Obsolete/service_string.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..7feef38f788ea8a6ea2aad14cdb1cbdd36da7fcf
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/Obsolete/service_string.cpp
@@ -0,0 +1,26 @@
+
+#include "service_string.hpp"
+#include <stdexcept>
+
+using namespace std;
+
+skysystem to_skysystem(std::string str)
+{
+   if(str.compare("GALACTIC") == 0) return skysystem::GALACTIC;
+   else if(str.compare("ICRS") == 0) return skysystem::ICRS;
+   else throw invalid_argument("string must be GALACTIC or ICRS but was " + str);
+}
+
+
+
+specsystem to_specsystem(int i)// special case for legacy interface, remove later
+{
+   switch(i)
+   {
+      case 0: return specsystem::NONE;
+      case 1: return specsystem::VELO_LSRK;
+      case 2: return specsystem::WAVE_Barycentric;
+      default: return specsystem::NONE;
+   }
+}
+
diff --git a/data-access/engine/src/vlkb-obscore/src/Obsolete/service_string.hpp b/data-access/engine/src/vlkb-obscore/src/Obsolete/service_string.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..0b7060524384318ce5b22d5dd9f4ef0dd9c2ea44
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/Obsolete/service_string.hpp
@@ -0,0 +1,14 @@
+#ifndef SERVICE_STRING_HPP
+#define SERVICE_STRING_HPP
+
+#include "cutout.hpp"
+#include "ast4vl.hpp" // uint_bounds needed
+#include <string>
+#include <vector>
+
+
+skysystem to_skysystem(std::string str);
+specsystem to_specsystem(int i);// special case for legacy interface, remove later
+std::string to_cfitsio_format(std::vector<uint_bounds> bounds);
+
+#endif
diff --git a/data-access/engine/src/vlkb-obscore/src/Obsolete/vlkb_ast.cpp b/data-access/engine/src/vlkb-obscore/src/Obsolete/vlkb_ast.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..3c4687094d2dccefc59c5d355439a6557e64564b
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/Obsolete/vlkb_ast.cpp
@@ -0,0 +1,176 @@
+
+#include "vlkb_ast.hpp"
+
+#include "ast4vl.hpp" // cout operator needed
+#include "cutout.hpp" // coordinates needed
+#include "cutout_ostream.hpp"
+#include "service_string.hpp"
+#include "fitsfiles.hpp" // header-string needed
+
+#include "io.hpp"
+#include "my_assert.hpp"
+
+#include <iostream>
+#include <climits> // INT_MAX needed
+#include <sstream>
+#include <vector>
+
+
+using namespace std;
+
+
+
+
+//---------------------------------------------------------------------
+// vertices
+//---------------------------------------------------------------------
+
+int vlkb_skyvertices(const string& pathname, const string& skysys_str)
+{
+   LOG_trace(__func__);
+
+   int maxHdu = 1; // FIXME should be INT_MAX to read all HDUs
+
+   std::vector<fitsfiles::Hdu> allHdus = 
+      fitsfiles::fname2hdrstr(pathname, maxHdu);
+
+   for(unsigned int i=0; i<allHdus.size(); i++)
+   {
+      cout << "HDU#" << i << endl;
+
+      fitsfiles::Hdu hd = allHdus.at(i);
+
+      vector<point2d> vertices = calc_skyvertices(hd.m_header, skysys_str);
+
+      for(point2d vertex : vertices) cout << " " << vertex << endl;
+   }
+
+   return 0;
+}
+
+
+
+//---------------------------------------------------------------------
+// bounds
+//---------------------------------------------------------------------
+/*
+const string VELOLSRK{"System=VELO,StdOfRest=LSRK,Unit=km/s"};
+const string WAVEBARY{"System=WAVE,StdOfRest=Bary,Unit=m"};
+*/
+int vlkb_listbounds(const string& skysys_str, const string& specsys_str, const string& pathname)
+{
+   LOG_trace(__func__);
+
+   int maxHdu = 1; // FIXME should be INT_MAX to read all HDUs
+
+   std::vector<fitsfiles::Hdu> allHdus = 
+      fitsfiles::fname2hdrstr(pathname, maxHdu);
+
+   for(unsigned int i=0; i<allHdus.size(); i++)
+   {
+      cout << "HDU#" << i << endl;
+
+      fitsfiles::Hdu hd = allHdus.at(i);
+
+      vector<Bounds> bounds_vec = calc_bounds(hd.m_header, skysys_str, specsys_str);
+
+      for(Bounds bnds : bounds_vec) cout << bnds << endl;
+   }
+
+   return 0;
+}
+
+
+
+
+
+
+
+
+
+//---------------------------------------------------------------------
+// overlap with area given in query-string form (name=value&...)
+//---------------------------------------------------------------------
+
+// parse query string to service::coordinates
+
+vector<string> split (const string &s, char delim)
+{
+   vector<string> result;
+   stringstream ss (s);
+   string item;
+
+   while (getline (ss, item, delim)) {
+      result.push_back (item);
+   }
+
+   return result;
+}
+
+coordinates parse_coordinates(const string query_string)
+{
+   LOG_trace(__func__);
+
+   coordinates coord;
+   coord.specsys = specsystem::NONE;
+
+   vector<string> params = split(query_string, '&');
+
+   for(string param : params)
+   {
+      vector<string> pair = split(param, '=');
+
+      my_assert(pair.size() == 2, __FILE__,__LINE__, "one '=' sign expected in '" + param + "'");
+
+      string name  = pair[0];
+      string value = pair[1];
+
+      if(name.empty() || value.empty()) continue;
+
+      /* FIXME what are inited values of coord if param not given in the query string */
+      if(name.compare("skysystem") == 0) coord.skysys = to_skysystem(value);
+      else if(name.compare("l") == 0)    coord.lon_deg = stod(value);
+      else if(name.compare("b") == 0)    coord.lat_deg = stod(value);
+      else if(name.compare("dl") == 0)   {coord.shape = area::RECT; coord.dlon_deg = stod(value);}
+      else if(name.compare("db") == 0)   {coord.shape = area::RECT; coord.dlat_deg = stod(value);}
+      else if(name.compare("r") == 0)    {coord.shape = area::CIRCLE; coord.dlon_deg = coord.dlat_deg = 2.0 * stod(value);}
+      else if(name.compare("specsystem") == 0) coord.specsys = to_specsystem(stoi(value));
+      else if(name.compare("vl") == 0)   coord.vl_kmps = stod(value);
+      else if(name.compare("vu") == 0)   coord.vu_kmps = stod(value);
+   }
+
+   return coord;
+}
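+
+// Example query string accepted by parse_coordinates() (values illustrative):
+//
+//   skysystem=GALACTIC&l=10.5&b=-0.3&dl=0.2&db=0.2&specsystem=1&vl=-50.0&vu=50.0
+//
+// "r=<radius_deg>" selects a CIRCLE shape instead of "dl"/"db" (RECT).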
+
+
+int vlkb_overlap(const string& pathname, const string& region)
+{
+   LOG_trace(__func__);
+
+   int maxHdu = 1; // FIXME with INT_MAX, fitsfiles::header throws an error reading past end-of-file
+
+   std::vector<fitsfiles::Hdu> allHdus = 
+      fitsfiles::fname2hdrstr(pathname, maxHdu);
+
+   //cout << region << endl;
+
+   const coordinates coord = parse_coordinates(region.c_str());
+   //cout << coord << endl;
+
+   int rc = -1; // FIXME only last ov_code will be returned from the cycle
+   for(unsigned int i=0; i<allHdus.size(); i++)
+   {
+      //cout << "HDU#" << i << endl;
+
+      fitsfiles::Hdu hd = allHdus.at(i);
+
+      vector<uint_bounds> bnds = calc_overlap(hd.m_header, coord, rc);
+      cout << to_cfitsio_format(bnds) << endl;
+
+      //for(uint_bounds bnd : bnds) cout << bnd << endl;
+   }
+
+   return rc;
+}
+
+
diff --git a/data-access/engine/src/vlkb-obscore/src/Obsolete/vlkb_ast.hpp b/data-access/engine/src/vlkb-obscore/src/Obsolete/vlkb_ast.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..9c60e8ff2ad4394ce9ba1a05953ec556a923af6a
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/Obsolete/vlkb_ast.hpp
@@ -0,0 +1,10 @@
+#ifndef VLKB_AST_H
+#define VLKB_AST_H
+
+#include <string>
+
+int vlkb_skyvertices(const std::string& pathname, const std::string& skysys_str);
+int vlkb_listbounds(const std::string& skysys_str, const std::string& specsys_str, const std::string& pathname);
+int vlkb_overlap(const std::string& pathname, const std::string& region);
+
+#endif
diff --git a/data-access/engine/src/vlkb-obscore/src/Obsolete/vlkb_dropdegen.cpp b/data-access/engine/src/vlkb-obscore/src/Obsolete/vlkb_dropdegen.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..13624f5f9219f731e002e5b2b2d514d340904ae2
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/Obsolete/vlkb_dropdegen.cpp
@@ -0,0 +1,153 @@
+
+#include "vlkb_dropdegen.hpp"
+
+#include "fitsfiles.hpp"
+#include "io.hpp"
+
+#include <fitsio.h>
+
+#include <string.h>
+#include <stdio.h>
+
+using namespace std;
+
+void dropdegen(fitsfile *fptr)
+{
+    LOG_trace(__func__);
+
+    FILE* lferr = stderr;
+
+    int status = 0;
+
+    // DBG print key statistics before any modifications
+    int keysexist = 0;
+    int morekeys  = 0;
+    if(fits_get_hdrspace(fptr,&keysexist,&morekeys,&status))
+        fits_report_error(lferr, status);
+//    else
+        // FIXME fprintf(lferr,"BEFORE keysexist:%d morekeys:%d\n",keysexist,morekeys);
+
+    int naxis = 0;
+    if(fits_read_key(fptr,TINT,"NAXIS",&naxis,NULL,&status))
+        fits_report_error(lferr, status);
+//    else
+        //fprintf(lfout, "%s: NAXIS  %d\n",__func__,naxis);
+
+    // Note: string lengths:
+    // char keyname[FLEN_KEYWORD], colname[FLEN_VALUE], coltype[FLEN_VALUE];
+
+    LOG_STREAM << "NAXIS: " << to_string(naxis) << endl;
+
+    int i;
+    int orignaxis = naxis;
+    for (i=0; (i<orignaxis) && (!status); i++) {
+
+        int axislen = 0;
+        char key[FLEN_KEYWORD] = {"\0"};
+        int kix = i+1; // keyindex (starts from 1)
+        sprintf(key,"NAXIS%d",kix);
+        // FIXME fprintf(lfout,"%s: start for %s\n",__func__,key);
+        if(fits_read_key(fptr,TINT,key,&axislen,NULL,&status))
+            fits_report_error(lferr, status);
+//        else
+            // FIXME fprintf(lfout, "%s: %s %d\n",__func__,key,axislen);
+
+        if(axislen != 1)
+            continue;
+
+        // found degen axis NAXISi = 1 -> remove it
+        // FIXME fprintf(lfout,"%s: degen axis %s\n",__func__,key);
+
+        // adjust NAXIS
+        int newvalue = --naxis;
+        // FIXME fprintf(lfout,"%s: new value NAXIS %d\n",__func__,newvalue);
+        if(fits_update_key(fptr,TINT,"NAXIS",&newvalue,NULL,&status))
+            fits_report_error(lferr, status);
+
+        // delete NAXISi ...
+        // FIXME fprintf(lfout,"%s: delete key %s\n",__func__,key);
+        if(fits_delete_key(fptr,key,&status))
+            fits_report_error(lferr, status);
+
+        // ... and all keys with a 5-char root followed by kix: CTYPEi CRVALi CRPIXi ...
+        // FIXME is this correct? What about keys with roots longer than 5 chars and
+        // alternative encodings (one letter after the axis number)?
+        // How to define which keys to remove?
+        char keys[FLEN_KEYWORD] = {"\0"};
+        sprintf(keys,"?????%d",kix);
+        // FIXME fprintf(lfout,"%s: delete keys %s ",__func__,keys);
+        while(fits_delete_key(fptr,keys,&status) != KEY_NO_EXIST){
+            ;// FIXME fprintf(lfout,".");
+        }
+        // FIXME fprintf(lfout,"\n");
+        if(status==KEY_NO_EXIST){
+            status = 0; // Reset after expected error in while()
+        } else {
+            fits_report_error(lferr, status);
+        }
+    }
+
+    // DBG print key statistics after any modifications
+    if(fits_get_hdrspace(fptr,&keysexist,&morekeys,&status))
+        fits_report_error(lferr, status);
+//    else
+        // FIXME fprintf(lferr,"AFTER  keysexist:%d morekeys:%d\n",keysexist,morekeys);
+}
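+
+// Example effect on a header (a sketch; the removed keys depend on the file):
+//   before: NAXIS=3, NAXIS1=100, NAXIS2=100, NAXIS3=1, CTYPE3, CRVAL3, CRPIX3
+//   after : NAXIS=2, NAXIS1=100, NAXIS2=100  (NAXIS3 and ?????3 keys deleted)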
+
+
+
+
+/*
+ * Filename is <somename>.fits -> try to read it as a FITS file.
+ */
+int vlkb_dropdegen(const char * fitsfname)
+{
+	LOG_trace(__func__);
+
+	fitsfile *fptr;
+	int status=0;
+	int rc=0;
+	int hdupos;
+
+	int iomode = READWRITE;
+
+	LOG_STREAM << "fits_open_file" << endl;
+
+	if (fits_open_file(&fptr, fitsfname, iomode, &status))
+	{
+		LOG_STREAM << "fitsfname: " << fitsfname << endl;
+		LOG_STREAM << fitsfiles::cfitsio_errmsg(__FILE__,__LINE__,status) << endl;
+		rc = 0; // FIXME open failure returns success; should this be -1?
+		goto f_end;
+	}
+	fits_get_hdu_num(fptr, &hdupos);  /* Get the current HDU position */
+
+	LOG_STREAM << "hdupos: " << to_string(hdupos) << endl;
+
+	for (; !status; hdupos++)  /* Main loop through each HDU/extension */
+	{
+		LOG_STREAM << "hdupos: " << to_string(hdupos) << endl;
+
+		// drop degenerate axis in current HDU
+		dropdegen(fptr);
+
+		// take next HDU
+		fits_movrel_hdu(fptr, 1, NULL, &status);  /* try to move to next HDU */
+	}
+
+	if (status == END_OF_FILE){
+		status = 0; // Reset after expected error in for
+	} else {
+		rc = -1;
+	}
+
+	fits_close_file(fptr, &status);
+
+f_end:
+	//printf("%s: rc = %d\n",__func__,rc);
+	return rc;
+}
+
+
+
+
diff --git a/data-access/engine/src/vlkb-obscore/src/Obsolete/vlkb_dropdegen.hpp b/data-access/engine/src/vlkb-obscore/src/Obsolete/vlkb_dropdegen.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..9822dec85eaf844811ca1e6abdaebe6809f41d0c
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/Obsolete/vlkb_dropdegen.hpp
@@ -0,0 +1,19 @@
+#ifndef VLKB_DROPDEGEN_HPP
+#define VLKB_DROPDEGEN_HPP
+
+//#include <fitsio.h>
+
+//#include <stdio.h>
+
+// Drop all degenerate axes from the header of a FITS file.
+// A degenerate axis is one with NAXISi = 1.
+// dropdegen() adjusts the NAXIS keyword, removes the NAXISi keyword(s)
+// and removes all keywords related to the degenerate axis 'i'.
+
+// drop degen in an opened file fptr
+//void dropdegen(fitsfile *fptr);
+
+// drop degen in all HDU of a fits file given by fitsname
+int vlkb_dropdegen(const char * fitsfname);
+
+#endif
diff --git a/data-access/engine/src/vlkb-obscore/src/Obsolete/vlkb_mcutout.cpp b/data-access/engine/src/vlkb-obscore/src/Obsolete/vlkb_mcutout.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..4b4e39cc60ffd35daf7c5f98c4a38e3ef894dc6e
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/Obsolete/vlkb_mcutout.cpp
@@ -0,0 +1,35 @@
+
+#include <fstream>
+#include <string>
+
+#include "config.hpp"
+#include "json.hpp"
+#include "mcutout_nljson.hpp"
+#include "mcutout_ostream.hpp"
+#include "mcutout.hpp"
+#include "vlkb_mcutout.hpp"
+
+
+using json = nlohmann::json;
+
+using namespace std;
+
+string vlkb_mcutout(string json_request_filename, config conf)
+{
+   const bool ASSERTS = true;
+
+   // read mcutout json from file
+
+   std::ifstream ifs(json_request_filename);
+   const std::string json_str( (std::istreambuf_iterator<char>(ifs) ),
+         (std::istreambuf_iterator<char>()    ) );
+
+   // do cutouts
+
+   json jcuts = json::parse(json_str, nullptr, ASSERTS);
+   vector<struct cut_param_s> cut_params = jcuts.get<vector<struct cut_param_s>>();
+   struct mcutout_res_s mres = mcutout(cut_params, conf);
+
+   return mres.tgz_filename;
+}
+
diff --git a/data-access/engine/src/vlkb-obscore/src/Obsolete/vlkb_mcutout.hpp b/data-access/engine/src/vlkb-obscore/src/Obsolete/vlkb_mcutout.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..a96237798a2cfc04d0aca5dd155d5b1f4bb385c9
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/Obsolete/vlkb_mcutout.hpp
@@ -0,0 +1,11 @@
+#ifndef VLKB_MCUTOUT_HPP
+#define VLKB_MCUTOUT_HPP
+
+#include <string>
+
+#include "config.hpp"
+
+std::string vlkb_mcutout(std::string json_request_filename, config conf);
+
+#endif
+
diff --git a/data-access/engine/src/vlkb-obscore/src/Obsolete/vlkb_mergefiles.cpp b/data-access/engine/src/vlkb-obscore/src/Obsolete/vlkb_mergefiles.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..35f72cabf71bbb6c40c96e3ccb5b674ca76a3291
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/Obsolete/vlkb_mergefiles.cpp
@@ -0,0 +1,46 @@
+
+#include "vlkb_mergefiles.hpp"
+
+#include "io.hpp"
+#include "mcutout.hpp"
+#include <sys/param.h> // NAME_MAX PATH_MAX
+
+
+#include <string>
+#include <vector>
+
+#include <string.h>
+
+
+
+
+using namespace std;
+
+
+
+string vlkb_mergefiles(const vector<string> filenames)
+{
+   const size_t fcnt = filenames.size();
+   char ffs[fcnt][PATH_MAX+NAME_MAX];
+
+   size_t i = 0;
+   for(string filename : filenames)
+      strcpy(ffs[i++], filename.c_str());
+/* 
+   char * fitsfs[fcnt];
+   for(i=0;i<fcnt;i++) fitsfs[i] = ffs[i];
+
+   struct merge_files mf = {"/tmp",".","X"};// = mroot mresdir prefix
+   char m_result[PATH_MAX+NAME_MAX];
+
+  
+   int rc = M4VL_mergefiles(&mf, fcnt, fitsfs, m_result, PATH_MAX+NAME_MAX);
+   if(rc != 0) cerr << "M4VL_mergefiles returned rc: " << to_string(rc) << endl;
+
+   return string(m_result);
+*/
+   return string("FIXME was implemented with M4VL_mergefiles which is common-internal. Re-implement with xmergefiles which is common-API.");
+}
+
+
+
diff --git a/data-access/engine/src/vlkb-obscore/src/Obsolete/vlkb_mergefiles.hpp b/data-access/engine/src/vlkb-obscore/src/Obsolete/vlkb_mergefiles.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..d31b5abc8f077220cebd5bcc5a313d47485d7297
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/Obsolete/vlkb_mergefiles.hpp
@@ -0,0 +1,9 @@
+#ifndef VLKB_MERGEFILES_HPP
+#define VLKB_MERGEFILES_HPP
+
+#include <string>
+#include <vector>
+
+std::string vlkb_mergefiles(const std::vector<std::string> filenames);
+
+#endif
diff --git a/data-access/engine/src/vlkb-obscore/src/config.cpp b/data-access/engine/src/vlkb-obscore/src/config.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..7f6b41778a69436ae1e2c6e71a18b1407c960435
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/config.cpp
@@ -0,0 +1,76 @@
+
+#include "io.hpp"
+#include "config.hpp"
+
+#include <iostream>
+#include <fstream>
+#include <sstream>
+#include <map>
+
+/* C
+#include <stdio.h>
+#include <stdlib.h> // atoi needed
+#include <string.h>
+*/
+
+using namespace std;
+
+void config::read_config(const std::string & settings_path)
+{
+   std::ifstream settings_file(settings_path);
+   std::string line;
+
+   LOG_STREAM << "config::read_config()" << endl;
+
+   if (settings_file.fail())
+   {
+      LOG_STREAM << "config file does not exist. Default options used." << endl;
+
+      return;
+   }
+
+
+   while (std::getline(settings_file, line))
+   {
+      std::istringstream iss(line);
+      std::string id, eq, val;
+
+      if (std::getline(iss, id, '='))
+      {
+         if (std::getline(iss, val))
+         {
+            if (m_settings.find(id) != m_settings.end())
+            {
+               if (val.empty())
+               {
+                  LOG_STREAM << "config " << id.c_str()
+                     << " is empty. Keeping default " << m_settings[id].c_str() << endl;
+               }
+               else
+               {
+                  m_settings[id] = val;
+                  LOG_STREAM << "config " << id.c_str()
+                     <<" read as " << m_settings[id].c_str() << endl;
+               }
+            }
+            else
+            {
+               //Not present in map
+               LOG_STREAM << "Setting "<< id.c_str() << " not defined, ignoring it" << endl;
+               continue;
+            }
+         }
+         else
+         {
+            // No value on the line (comment or missing '='), skipping it
+            continue;
+         }
+      }
+      else
+      {
+         //Empty line, skipping it
+         continue;
+      }
+   }
+}
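+
+// The settings file is parsed as one key=value pair per line; unknown keys are
+// ignored and empty values keep the defaults. An illustrative file (values are
+// assumptions, keys as defined in config.hpp):
+//
+//   fits_path_surveys=/srv/surveys
+//   pg_uri=postgresql://user:password@localhost:5432/vialactea
+//   pg_schema=datasets
+//   log_dir=/tmp
+//   log_filename=vlkb-obscore.log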
+
diff --git a/data-access/engine/src/vlkb-obscore/src/config.hpp b/data-access/engine/src/vlkb-obscore/src/config.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..40fe226c0b8eb1cabdfffb25b4d61522eb09b4cf
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/config.hpp
@@ -0,0 +1,68 @@
+
+#ifndef CONFIG_HPP
+#define CONFIG_HPP
+
+#include <string>
+#include <map>
+
+
+class config
+{
+   public:
+
+      void read_config(const std::string & settings_path);
+
+      std::string getFitsDir() const       {return m_settings.at(fits_dir);}
+
+      std::string getDbUri(bool with_password = false) const {return m_settings.at(pg_uri);} // FIXME with_password is unused
+      std::string getDbSchema() const   {return m_settings.at(pg_schema);}
+
+      std::string getObsCorePublisher() const {return m_settings.at(obscore_publisher);}
+      std::string getObscoreAccessUrl() const {return m_settings.at(obscore_access_url);}
+      std::string getObsCoreAccessFormat() const {return m_settings.at(obscore_access_format);}
+
+      std::string getLogDir() const      {return m_settings.at(log_dir);}
+      std::string getLogFileName() const {return m_settings.at(log_filename);}
+
+
+   private:
+      std::string value(std::string key) {return m_settings.at(key);}
+
+      const std::string fits_dir{"fits_path_surveys"};
+
+      const std::string pg_uri{"pg_uri"};
+      const std::string pg_schema{"pg_schema"};
+
+      const std::string obscore_publisher{"obscore_publisher"};
+      const std::string obscore_access_url{"obscore_access_url"};
+      const std::string obscore_access_format{"obscore_access_format"};
+
+      const std::string log_dir{"log_dir"};
+      const std::string log_filename{"log_filename"};
+
+      //-------------------------------------------------
+      // defaults
+      //-------------------------------------------------
+
+      const std::string empty_string;
+
+      std::map<const std::string, std::string> m_settings
+      {
+         {fits_dir, "/srv/surveys"},
+
+         {pg_uri,    empty_string},
+         {pg_schema, empty_string},
+
+         {obscore_publisher,     empty_string},
+         {obscore_access_url,    empty_string},
+         {obscore_access_format, "application/fits"},
+
+         {log_dir,      "/tmp"},
+         {log_filename, "vlkb-obscore.log"},
+      };
+};
+
+
+#endif
+
diff --git a/data-access/engine/src/vlkb-obscore/src/database.hpp b/data-access/engine/src/vlkb-obscore/src/database.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..d7a5f8e772e9ccde85151680156099061f3cbd27
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/database.hpp
@@ -0,0 +1,34 @@
+#ifndef DATABASE_HPP
+#define DATABASE_HPP
+
+#include <vector>
+#include <string>
+
+std::string createPubdid(const std::string path, const std::string filename, unsigned int hdunum);
+
+namespace database
+{
+   void dbInit(const std::string db_uri, const std::string db_schema, const std::string surveys_file);
+
+   void dbAddSurvey(int sid, const std::string groups,
+         const std::string obscore_publisher,
+         const std::string obscore_access_format,
+         const std::string remote_fitsdir,
+         const std::string db_uri, const std::string db_schema,
+         const std::string fitsdir, int max_hdupos);
+
+   void dbModifyGroups(int sid, const std::string groups,
+         const std::string obscore_publisher,
+         const std::string db_uri, const std::string db_schema);
+
+   void dbRemoveSurvey(int sid, const std::string db_uri, const std::string db_schema);
+   void dbSurveyBounds(/*int sid,*/ const std::string db_uri, const std::string db_schema);
+   std::vector<std::string> dbCheck(const std::string db_uri, const std::string db_schema);
+   std::vector<std::string> dbListSurveys(const std::string db_uri, const std::string db_schema);
+   std::string dbListFiles(int sid, const std::string db_uri, const std::string db_schema, const std::string fitsdir);
+   void dbGenCards(int sid, const std::string db_uri, const std::string db_schema,
+         const std::string fitsdir, unsigned int max_hdunum);
+}
+
+#endif
+
diff --git a/data-access/engine/src/vlkb-obscore/src/database/DbConn.cpp b/data-access/engine/src/vlkb-obscore/src/database/DbConn.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..730ddb5c2ff483bded15176744b185b3e0307bb8
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/database/DbConn.cpp
@@ -0,0 +1,483 @@
+
+#include <iostream>
+#include <iomanip>
+#include <string>
+#include <pqxx/pqxx>
+
+#include <libgen.h> // basename() needed
+
+#include "DbConn.hpp"
+
+#include "my_assert.hpp"
+#include "io.hpp"
+
+
+using namespace std;
+
+/*   DbConn::DbConn
+(std::string dbms,
+ std::string host_name,
+ std::string port,
+ std::string schema,
+ std::string db_name,
+ std::string user_name,
+ std::string password
+ ):m_schema(schema)
+   ,m_uri(dbms + "://" + user_name + ":"+ password + "@" + host_name + ":" + port + "/" + db_name)
+   ,m_conn(m_uri)
+{
+   LOG_trace(__func__);
+   LOG_STREAM << "DB[" << m_schema << "]: " << m_uri << endl;
+
+   // FIXME port must be uint16_t
+   // FIXME add check: dbms must be postgresql otherwise error
+};
+*/
+
+DbConn::DbConn(std::string db_uri, std::string schema)
+   : m_schema(schema), m_uri(db_uri), m_conn(m_uri)
+{
+   LOG_trace(__func__);
+   LOG_STREAM << "DB[" << m_schema << "]: " << m_uri << endl;
+};
+
+
+// VLKB tables
+
+// FIXME table to be created under Schema m_schema
+// "SELECT current_schema();"
+// PostgreSQL will create tables into 'current schema':
+// current schema is the first schema mentioned on the search_path:
+// SHOW search_path();
+// SELECT current_schema();
+
+
+
+void DbConn::dbExecCmds(const vector<string> sqlCmds)
+{
+   LOG_trace(__func__);
+
+   pqxx::work txn{m_conn};
+
+   try
+   {
+      for(string cmd : sqlCmds)
+         txn.exec(cmd);
+      txn.commit();
+   }
+   catch (pqxx::sql_error const &e)
+   {
+      LOG_STREAM << "SQL error: " << e.what()  << endl;
+      LOG_STREAM << "Query was: " << e.query() << endl;
+      throw;
+   }
+   catch (std::exception const &e)
+   {
+      LOG_STREAM << "Error: " << e.what() << endl;
+      throw;
+   }
+};
+
+
+
+pqxx::result DbConn::dbExecQuery3(string query)
+{
+   LOG_trace(__func__);
+
+   pqxx::work txn{m_conn};
+
+   try
+   {
+      pqxx::result r{txn.exec(query)};
+      return r;
+   }
+   catch (pqxx::sql_error const &e)
+   {
+      LOG_STREAM << "SQL error: " << e.what()  << endl;
+      LOG_STREAM << "Query was: " << e.query() << endl;
+   }
+   catch (std::exception const &e)
+   {
+      LOG_STREAM << "Error: " << e.what() << endl;
+   }
+   return pqxx::result{};
+};
+
+
+
+void DbConn::dbExecQuery(string query)
+{
+   LOG_trace(__func__);
+
+   pqxx::work txn{m_conn};
+
+   try
+   {
+      pqxx::result r{txn.exec(query)};
+
+      // column names
+      for(/*pqxx::row::size_type*/size_t i=0; i<r.columns();i++) LOG_STREAM << r.column_name(i) << "\t";
+      LOG_STREAM << endl;
+
+
+      // Results can be accessed and iterated again.  Even after the connection
+      // has been closed.
+      for (auto row: r)
+      {
+         // Iterate over fields in a row.
+         for (auto field: row) LOG_STREAM << field.c_str() << " ";
+         LOG_STREAM << endl;
+      }
+   }
+   catch (pqxx::sql_error const &e)
+   {
+      LOG_STREAM << "SQL error: " << e.what()  << endl;
+      LOG_STREAM << "Query was: " << e.query() << endl;
+   }
+   catch (std::exception const &e)
+   {
+      LOG_STREAM << "Error: " << e.what() << endl;
+   }
+};
+
+
+vector<string> DbConn::dbExecQuery2(string query)
+{
+   LOG_trace(__func__);
+
+   pqxx::work txn{m_conn};
+
+   vector<string> rr;
+
+   try
+   {
+      pqxx::result r{txn.exec(query)};
+
+      // column names
+      string columns;
+      for(/*pqxx::row::size_type*/size_t i=0; i<r.columns();i++)
+      {
+         columns += string{r.column_name(i)} + " | ";
+      }
+      rr.push_back(columns);
+
+      // Results can be accessed and iterated again.  Even after the connection
+      // has been closed.
+      for (auto row: r)
+      {
+         // Iterate over fields in a row.
+         string one_row;
+         for (auto field: row) one_row += to_string(field) + " | ";
+         rr.push_back(one_row);
+      }
+   }
+   catch (pqxx::sql_error const &e)
+   {
+      LOG_STREAM << "SQL error: " << e.what()  << endl;
+      LOG_STREAM << "Query was: " << e.query() << endl;
+   }
+   catch (std::exception const &e)
+   {
+      LOG_STREAM << "Error: " << e.what() << endl;
+   }
+
+   return rr;
+};
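+
+// Rows are returned as " | "-separated text, column names first, e.g.
+// (a sketch with placeholder values):
+//   {"header_id | pubdid | ", "1 | <pubdid-1> | ", "2 | <pubdid-2> | "}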
+
+
+
+
+
+//-----------------------------------------//
+// queries with specific results format    //
+//-----------------------------------------//
+
+
+Survey DbConn::querySurveyAttributes(int sid)
+{
+   LOG_trace(__func__);
+
+   pqxx::work txn{m_conn};
+
+   Survey surv("surveys");
+
+   std::string qstr(surv.qry_SELECT_Attribs(sid));
+
+//   try
+//   {
+      pqxx::result r{txn.exec(qstr)};
+      my_assert(1==r.size(), __FILE__, __LINE__, "survey SID[" + to_string(sid) + "] found " + to_string(r.size()) + " times");
+      surv.handleResult(r);
+/*   }
+   catch (pqxx::sql_error const &e)
+   {
+      LOG_STREAM << "SQL error: " << e.what()  << endl;
+      LOG_STREAM << "Query was: " << e.query() << endl;
+      return surv;
+   }
+   catch (std::exception const &e)
+   {
+      LOG_STREAM << "Error: " << e.what() << endl;
+      return surv;
+   }
+*/
+   return surv;
+};
+
+
+
+
+// used in adCheck()
+std::vector<std::string> DbConn::queryTableNames(void)
+{
+   LOG_trace(__func__);
+
+   std::vector<std::string> tnames;
+
+   pqxx::work txn{m_conn};
+
+   std::string qstr(
+         "SELECT t.table_name "
+         "FROM information_schema.tables t "
+         "WHERE t.table_schema = '" + m_schema + "'"
+         "AND t.table_type = 'BASE TABLE' "
+         "ORDER BY t.table_name");
+
+   try
+   {
+      pqxx::result r{txn.exec(qstr)};
+
+      // Results can be accessed and iterated again.  Even after the connection
+      // has been closed.
+      for (auto row: r)
+      {
+         // Iterate over fields in a row.
+         for (auto field: row)
+         {
+            tnames.push_back(field.c_str());
+         }
+      }
+   }
+   catch (pqxx::sql_error const &e)
+   {
+      LOG_STREAM << "SQL error: " << e.what()  << endl;
+      LOG_STREAM << "Query was: " << e.query() << endl;
+      return tnames;
+   }
+   catch (std::exception const &e)
+   {
+      LOG_STREAM << "Error: " << e.what() << endl;
+      return tnames;
+   }
+   return tnames;
+};
+
+
+
+std::string DbConn::queryPubdid(int hid)
+{
+   LOG_trace(__func__);
+
+   pqxx::work txn{m_conn};
+
+   // FIXME consider schema to go to search_path setting and avoid it to be in each query
+   std::string qstr("SELECT pubdid,header_id FROM " + m_schema + ".headers WHERE header_id = "
+         + std::to_string(hid));
+
+   std::string pubdid;
+
+   try
+   {
+      pqxx::result r{txn.exec(qstr)};
+
+      // Results can be accessed and iterated again.  Even after the connection
+      // has been closed.
+      for (auto row: r)
+      {
+         LOG_STREAM << "Row: ";
+         //         strcpy(pubdid, row["pubdid"].c_str());
+         pubdid = row["pubdid"].as<std::string>();
+         // Iterate over fields in a row.
+         for (auto field: row) LOG_STREAM << field.c_str() << " ";
+         LOG_STREAM << endl;
+      }
+   }
+   catch (pqxx::sql_error const &e)
+   {
+      LOG_STREAM << "SQL error: " << e.what()  << endl;
+      LOG_STREAM << "Query was: " << e.query() << endl;
+      return pubdid;
+   }
+   catch (std::exception const &e)
+   {
+      LOG_STREAM << "Error: " << e.what() << endl;
+      return pubdid;
+   }
+
+   return pubdid;
+};
+
+int DbConn::queryMaxHid(void)
+{
+   LOG_trace(__func__);
+
+   pqxx::work txn{m_conn};
+
+   std::string qstr("SELECT coalesce(max(header_id), 0) AS max FROM headers");
+
+   LOG_STREAM << qstr << endl;
+
+   int max_hid = -1; // coalesce() replaces NULL with 0 when the table is empty, so the app starts counting from 1
+
+   try
+   {
+      pqxx::result r{txn.exec(qstr)};
+
+      // should iterate once only: SQL MAX() returns a single value (per the PostgreSQL docs)
+      for (auto row: r)
+      {
+         max_hid = row["max"].as<int>();
+      }
+   }
+   catch (pqxx::sql_error const &e)
+   {
+      LOG_STREAM << "SQL error: " << e.what()  << endl;
+      LOG_STREAM << "Query was: " << e.query() << endl;
+      throw;
+   }
+   catch (std::exception const &e)
+   {
+      LOG_STREAM << "Error: " << e.what() << endl;
+      throw;
+   }
+
+   return max_hid;
+};
+
+
+cut_db_params_t DbConn::queryCutParams(std::string pubdid)
+{
+   LOG_trace(__func__);
+
+   pqxx::work txn{m_conn};
+
+   // FIXME schema to search_path setting
+   std::string qstr("SELECT pubdid,filename,hdunum,rest_frequency,storage_path,header,velocity_fits_unit FROM  "
+      + m_schema + ".headers INNER JOIN "
+      + m_schema + ".surveys ON "+m_schema+".headers.survey_id = "+m_schema+".surveys.survey_id WHERE pubdid = '"+ pubdid + "'");
+
+   cut_db_params_t cut_params;
+
+   LOG_STREAM << qstr << endl;
+
+   try
+   {
+      pqxx::result r{txn.exec(qstr)};
+
+      // Results can be accessed and iterated again.
+      // Even after the connection has been closed.
+
+      // FIXME assert(rowcount == 1): exactly one row returned 
+
+      LOG_STREAM << "RowsCount: " << r.size() << endl;
+      for (auto row: r)
+      {
+         cut_params.header        = row["header"].as<std::string>();
+         cut_params.filename      = row["filename"].as<std::string>();
+         cut_params.hdunum        = row["hdunum"].as<unsigned int>();
+         cut_params.storage_path  = row["storage_path"].as<std::string>();
+
+         const double default_when_null = 0.0;
+         cut_params.rest_frequency = row["rest_frequency"].as<double>(default_when_null);
+         const string empty_str;
+         cut_params.velocity_fits_unit = row["velocity_fits_unit"].as<std::string>(empty_str);
+
+         // Iterate over fields in a row.
+         //for (auto field: row) LOG_STREAM << field.c_str() << " "; LOG_STREAM << endl;
+      }
+   }
+   catch (pqxx::sql_error const &e)
+   {
+      LOG_STREAM << __func__ << ": SQL error: " << e.what()  << endl;
+      LOG_STREAM << __func__ << ": Query was: " << e.query() << endl;
+      //return cut_params;
+   }
+   catch (std::exception const &e)
+   {
+      LOG_STREAM << __func__ << ": Error: " << e.what() << endl;
+      //return cut_params;
+   }
+
+   return cut_params;
+};
+
+
+
+cut_db_params_t DbConn::BqueryCutParams(std::string pubdid)
+{
+   LOG_trace(__func__);
+
+   pqxx::work txn{m_conn};
+
+   /* FIXME schema to search_path setting
+   std::string qstr("SELECT pubdid,filename,hdunum,rest_frequency,storage_path,header,velocity_fits_unit FROM  "
+      + m_schema + ".headers INNER JOIN "
+      + m_schema + ".surveys ON "+m_schema+".headers.survey_id = "+m_schema+".surveys.survey_id WHERE pubdid = '"+ pubdid + "'");
+*/
+   std::string qstr("SELECT access_url,rest_frequency,storage_path,velocity_fits_unit,name,species,transition FROM  "
+      +m_schema+".obscore INNER JOIN "+ m_schema + ".surveys ON "
+      +m_schema+".obscore.obs_collection = CONCAT("
+         +m_schema+".surveys.name, ' ',"
+         +m_schema+".surveys.species, ' ',"
+         +m_schema+".surveys.transition)"
+      +" WHERE obs_publisher_did LIKE '%" + pubdid + "'"); // FIXME replace LIKE with exact match but concat ivoprefix+pubdid
+
+   cut_db_params_t cut_params;
+
+   LOG_STREAM << qstr << endl;
+
+   try
+   {
+      pqxx::result r{txn.exec(qstr)};
+
+      // Results can be accessed and iterated again.
+      // Even after the connection has been closed.
+
+      string aa(10*80,'a'); // placeholder header: 10 cards of 80 chars
+      long int row_count = r.size();// FIXME check type
+      LOG_STREAM << "RowsCount: " << row_count << endl;
+
+      if(row_count != 1)
+         throw invalid_argument("found " + to_string(row_count) +  " rows instead of one for pubdid: " + pubdid);
+
+      for (auto row: r)
+      {
+         cut_params.header        = aa;//row["header"].as<std::string>();
+         cut_params.filename      = row["access_url"].as<std::string>();
+         cut_params.hdunum        = 1;//row["hdunum"].as<unsigned int>();
+         cut_params.storage_path  = row["storage_path"].as<std::string>();
+
+         const double default_when_null = 0.0;
+         cut_params.rest_frequency = row["rest_frequency"].as<double>(default_when_null);
+         const string empty_str;
+         cut_params.velocity_fits_unit = row["velocity_fits_unit"].as<std::string>(empty_str);
+
+         cut_params.name = row["name"].as<std::string>(empty_str);
+         cut_params.species = row["species"].as<std::string>(empty_str);
+         cut_params.transition = row["transition"].as<std::string>(empty_str);
+
+         // Iterate over fields in a row.
+         //for (auto field: row) LOG_STREAM << field.c_str() << " "; LOG_STREAM << endl;
+      }
+   }
+   catch (pqxx::sql_error const &e)
+   {
+      throw runtime_error("SQL error: " + string{e.what()} + " Query: " + string{e.query()});
+   }
+
+   return cut_params;
+};
+
+
+
diff --git a/data-access/engine/src/vlkb-obscore/src/database/DbConn.hpp b/data-access/engine/src/vlkb-obscore/src/database/DbConn.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..09af95e212c1b900436aec52ebb99ba6c0ae1ab5
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/database/DbConn.hpp
@@ -0,0 +1,67 @@
+#ifndef DBCONN_HPP
+#define DBCONN_HPP
+
+#include <string>
+#include <pqxx/pqxx>
+
+#include "SqlSurvey.hpp"
+
+
+
+typedef struct cut_db_params
+{
+   std::string header;
+   std::string filename;
+   unsigned int hdunum;
+   double rest_frequency;
+   std::string velocity_fits_unit;
+   std::string storage_path;
+   std::string name;
+   std::string species;
+   std::string transition;
+} cut_db_params_t;
+
+
+
+
+class DbConn
+{
+   public:
+
+/*      DbConn
+         (std::string dbms,
+          std::string host_name,
+          std::string port,
+          std::string schema,
+          std::string db_name,
+          std::string user_name,
+          std::string password
+         );
+*/
+      DbConn(std::string db_uri, std::string schema);
+
+
+      cut_db_params_t queryCutParams(std::string pubdid);
+      cut_db_params_t BqueryCutParams(std::string pubdid);
+      std::string queryPubdid(int hid);
+      int queryMaxHid(void);
+
+      // utils
+      void dbExecCmds(const std::vector<std::string> sqlCmds);
+      void dbExecQuery(std::string query);
+      pqxx::result dbExecQuery3(std::string query);
+      std::vector<std::string> dbExecQuery2(std::string query);
+      std::vector<std::string> queryTableNames(void);
+      Survey querySurveyAttributes(int sid);
+
+
+   private:
+
+      std::string m_schema;
+      std::string m_uri;
+
+      pqxx::connection m_conn;
+};
+
+#endif
+
diff --git a/data-access/engine/src/vlkb-obscore/src/database/DbConn_ostream.cpp b/data-access/engine/src/vlkb-obscore/src/database/DbConn_ostream.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..0839e74a2913aabfad8c507cfb5677ee4d32e0f6
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/database/DbConn_ostream.cpp
@@ -0,0 +1,21 @@
+
+#include "DbConn.hpp"
+#include "DbConn_ostream.hpp"
+
+#include <iostream>
+#include <iomanip>
+
+using namespace std;
+
+std::ostream& operator<<( std::ostream &out, cut_db_params_t const& p)
+{
+	out << "h: " << p.header.substr(80,80)
+      << " f: " << p.filename
+      << "[" << to_string(p.hdunum)
+      << "]  f: " << to_string(p.rest_frequency)
+      << " v: " << p.velocity_fits_unit
+      << " p: " << p.storage_path;
+	return out;
+}
+
+
diff --git a/data-access/engine/src/vlkb-obscore/src/database/DbConn_ostream.hpp b/data-access/engine/src/vlkb-obscore/src/database/DbConn_ostream.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..cbf40ed3d6e41e18b7515165e2643af5c9d0f60f
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/database/DbConn_ostream.hpp
@@ -0,0 +1,10 @@
+#ifndef DBCONN_OSTREAM_HPP
+#define DBCONN_OSTREAM_HPP
+
+#include "DbConn.hpp"
+#include <iostream>
+
+std::ostream& operator<<( std::ostream &out, cut_db_params_t const& p);
+
+#endif
+
diff --git a/data-access/engine/src/vlkb-obscore/src/database/ObsCoreKeys.hpp b/data-access/engine/src/vlkb-obscore/src/database/ObsCoreKeys.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..874291ca4de51eb2a02a2f7535f2a6a033bec2f1
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/database/ObsCoreKeys.hpp
@@ -0,0 +1,37 @@
+#ifndef OBSCOREKEYS_HPP
+#define OBSCOREKEYS_HPP
+
+#include <string>
+#include <set>
+
+
+namespace ObsCoreKeys
+{
+   const std::set<std::string> strKeys
+      = {
+         "OBJECT",    
+         "DATE-OBS",  
+         "DATE-END",  
+         "CUNIT1",  
+         "CUNIT2",  
+         "CUNIT3"
+      };
+   const std::set<std::string> uintKeys
+      = {
+         "NAXIS1",
+         "NAXIS2",
+         "NAXIS3"
+      };
+   const std::set<std::string> doubleKeys
+      = {
+         "CDELT1",            
+         "CDELT2",
+         "CDELT3"
+      };
+
+   inline std::set<std::string> add_str_keys(const std::set<std::string> strSet1, std::set<std::string> strSet2)
+            { std::set<std::string> str_set{strSet1}; str_set.insert(strSet2.begin(),strSet2.end()); return str_set;}
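+
+   // e.g. add_str_keys(strKeys, {"TELESCOP"}) returns the union of strKeys and
+   // the (illustrative) extra key "TELESCOP"; neither input set is modified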
+};
+
+#endif
+
diff --git a/data-access/engine/src/vlkb-obscore/src/database/ObsCoreTime.cpp b/data-access/engine/src/vlkb-obscore/src/database/ObsCoreTime.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..8298a8be5a21a7f353acf818aab738493f03923e
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/database/ObsCoreTime.cpp
@@ -0,0 +1,125 @@
+
+#include "ObsCoreTime.hpp"
+
+#include "io.hpp"
+
+#include <cmath> // isnan() needed
+
+using namespace std;
+
+
+const std::string SqlNull = "NULL";
+const unsigned long SecsInDay = 86400;
+
+//----------------------------------------------------------------
+// ISO8601 -> Modified Julian Date
+//----------------------------------------------------------------
+
+double fitsdate2MJD(const string fitsdate)
+{
+   LOG_trace(__func__);
+
+   int y,M,d,h,m;
+   float s;
+   sscanf(fitsdate.c_str(), "%d-%d-%dT%d:%d:%fZ", &y, &M, &d, &h, &m, &s);
+
+   LOG_STREAM<<"DBGTIME: "<<fitsdate<< " : " << y << " " << M << " " << d << " t " << h << " " << m << " " << s << endl;
+
+   unsigned long jdn = (1461 * (y + 4800 + (M-14)/12))/4 + \
+                       (367  * (M - 2 - 12*((M-14)/12)))/12 - \
+                       (3    * ((y + 4900 + (M - 14)/12)/100))/4 + d - 32075 ;
+
+   // JD = JDN + day fraction (JD days start at noon); MJD = JD - 2400000.5
+   double jd = (double)jdn + (h-12)/24.0 + m/1440.0 + s/86400.0;
+
+   double mjd = jd - 2400000.5;
+
+   LOG_STREAM << "DBGTIME MJD: " << mjd << endl;
+
+   return mjd;
+}
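+
+// Worked check (a sketch): "2000-01-01T12:00:00Z" gives JDN 2451545,
+// hence JD 2451545.0 and MJD 2451545.0 - 2400000.5 = 51544.5.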
+
+
+
+//----------------------------------------------------------------
+// FIXME if use map<> and returns strings:
+// also Card-not-present case handled by setting SqlNull in ObsCore
+//----------------------------------------------------------------
+obscore::ObsCoreTime obscore::calcObsCoreTime(std::string DATEOBS, std::string DATEEND)
+{
+   struct obscore::ObsCoreTime ocTime;
+
+   if(!DATEOBS.empty())
+   {
+      ocTime.t_min = fitsdate2MJD(DATEOBS.c_str());
+      if(std::isnan(ocTime.t_min))
+      {
+         LOG_STREAM << "DBGTIME t_min is NaN : " << ocTime.t_min << endl;
+         ocTime.t_min_str = SqlNull;
+      }
+      else if (ocTime.t_min < 0.0)
+      {
+         LOG_STREAM << "DBGTIME t_min is negative : " << ocTime.t_min << endl;
+         ocTime.t_min_str = SqlNull;
+      }
+      else
+      {
+         ocTime.t_min_str = to_string(ocTime.t_min);
+      }
+   }
+   else
+   {
+      ocTime.t_min_str = SqlNull;
+   }
+
+   if(!DATEEND.empty())
+   {
+      ocTime.t_max = fitsdate2MJD(DATEEND.c_str());
+      if(std::isnan(ocTime.t_max))
+      {
+         LOG_STREAM << "DBGTIME t_max is NaN : " << ocTime.t_max << endl;
+         ocTime.t_max_str = SqlNull;
+      }
+      else if (ocTime.t_max < 0.0)
+      {
+         LOG_STREAM << "DBGTIME t_max is negative : " << ocTime.t_max << endl;
+         ocTime.t_max_str = SqlNull;
+      }
+      else
+      {
+         ocTime.t_max_str = to_string(ocTime.t_max);
+      }
+   }
+   else
+   {
+      ocTime.t_max_str = SqlNull;
+   }
+
+   if(0) // FIXME disabled: was t_min_str.compare(SqlNull) && t_max_str.compare(SqlNull)
+   {
+      double dSecsInDay = double(SecsInDay);
+      ocTime.t_exptime = (ocTime.t_max - ocTime.t_min) * dSecsInDay; // [sec]
+      ocTime.t_exptime_str = to_string(ocTime.t_exptime);
+   }
+   else
+   {
+      ocTime.t_exptime_str = SqlNull;
+   }
+
+   // FIXME not implemented: need to define on how to calc
+
+   ocTime.t_resolution_str  = SqlNull;
+   ocTime.t_xel_str         = SqlNull;
+   /*
+      DEBUG_STREAM<<"DBGTIME: "
+      << ocTime.t_min
+      << " "
+      << ocTime.t_max
+      << " -> "
+      << ocTime.t_exptime
+      << endl;
+      */
+   return ocTime;
+}
+
diff --git a/data-access/engine/src/vlkb-obscore/src/database/ObsCoreTime.hpp b/data-access/engine/src/vlkb-obscore/src/database/ObsCoreTime.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..b9a47350ab0972bb999184172f194f8b7f76fc25
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/database/ObsCoreTime.hpp
@@ -0,0 +1,30 @@
+
+#ifndef OBSCORETIME_HPP
+#define OBSCORETIME_HPP
+
+#include <string>
+
+namespace obscore
+{
+
+
+struct ObsCoreTime
+{
+      double t_min;
+      std::string t_min_str;
+      double t_max;
+      std::string t_max_str;
+      double t_exptime;
+      std::string t_exptime_str;
+      double t_resolution;
+      std::string t_resolution_str;
+      unsigned long int t_xel;
+      std::string t_xel_str;
+};
+
+ObsCoreTime calcObsCoreTime(std::string DATEOBS, std::string DATEEND);
+
+}
+
+#endif
+
diff --git a/data-access/engine/src/vlkb-obscore/src/database/SqlSchema.cpp b/data-access/engine/src/vlkb-obscore/src/database/SqlSchema.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..1c88b937dee7c7147198d461e89c3fc1a9237520
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/database/SqlSchema.cpp
@@ -0,0 +1,204 @@
+
+#include "SqlSchema.hpp"
+
+
+using namespace std;
+
+SqlSchema::SqlSchema()
+{
+   reset();
+}
+
+
+void SqlSchema::reset(void)
+{
+   headersRow
+      = {
+         {headersColId::filename,   "filename   VARCHAR(4096)   NOT NULL"},
+         {headersColId::hdunum,     "hdunum     INTEGER         NOT NULL"},
+         {headersColId::pubdid,     "pubdid     VARCHAR(4096)   UNIQUE"},
+         {headersColId::header,     "header     TEXT            NOT NULL"},
+         {headersColId::survey_id,  "survey_id  INTEGER"},
+         {headersColId::header_id,  "header_id  INTEGER         PRIMARY KEY"}
+      };
+
+
+   boundsgalRow
+      = {
+         {bgheader_id,  "header_id  INTEGER       PRIMARY KEY"},
+         {lfrom,   "lon_from  DECIMAL(21,16) NOT NULL" },
+         {lto,     "lon_to    DECIMAL(21,16) NOT NULL" },
+         {lunit,   "lon_unit  VARCHAR(16)" },
+
+         {bfrom,   "lat_from  DECIMAL(21,16) NOT NULL" },
+         {bto,     "lat_to    DECIMAL(21,16) NOT NULL" },
+         {bunit,   "lat_unit  VARCHAR(16)" },
+
+         {vfrom,   "vel_from  DECIMAL(21,8)" },// FIXME check why 8 not 16
+         {vto,     "vel_to    DECIMAL(21,8)" },
+         {vunit,   "vel_unit  VARCHAR(16)" },
+      };
+
+   boundsicrsRow
+      = {
+         {biheader_id,  "header_id  INTEGER       PRIMARY KEY"},
+         {rfrom,   "ra_from  DECIMAL(21,16) NOT NULL" },
+         {rto,     "ra_to    DECIMAL(21,16) NOT NULL" },
+         {runit,   "ra_unit  VARCHAR(16)" },
+
+         {dfrom,   "dec_from  DECIMAL(21,16) NOT NULL" },
+         {dto,     "dec_to    DECIMAL(21,16) NOT NULL" },
+         {dunit,   "dec_unit  VARCHAR(16)" },
+      };
+
+   verticesgalRow
+      = {
+         {vgheader_id,  "header_id  INTEGER        PRIMARY KEY"},
+         {p1lon, "p1_lon DECIMAL(21,16) NOT NULL"},
+         {p1lat, "p1_lat DECIMAL(21,16) NOT NULL"},
+         {p2lon, "p2_lon DECIMAL(21,16) NOT NULL"},
+         {p2lat, "p2_lat DECIMAL(21,16) NOT NULL"},
+         {p3lon, "p3_lon DECIMAL(21,16) NOT NULL"},
+         {p3lat, "p3_lat DECIMAL(21,16) NOT NULL"},
+         {p4lon, "p4_lon DECIMAL(21,16) NOT NULL"},
+         {p4lat, "p4_lat DECIMAL(21,16) NOT NULL"}
+      };
+
+   verticesicrsRow
+      = {
+         {viheader_id,  "header_id  INTEGER           PRIMARY KEY"},
+         {p1ra,  "p1_ra  DECIMAL(21,16) NOT NULL"},
+         {p1dec, "p1_dec DECIMAL(21,16) NOT NULL"},
+         {p2ra,  "p2_ra  DECIMAL(21,16) NOT NULL"},
+         {p2dec, "p2_dec DECIMAL(21,16) NOT NULL"},
+         {p3ra,  "p3_ra  DECIMAL(21,16) NOT NULL"},
+         {p3dec, "p3_dec DECIMAL(21,16) NOT NULL"},
+         {p4ra,  "p4_ra  DECIMAL(21,16) NOT NULL"},
+         {p4dec, "p4_dec DECIMAL(21,16) NOT NULL"}
+      };
+
+
+
+   obscoreRow
+      = {
+         {dataproduct_type, "dataproduct_type   VARCHAR"},
+         {calib_level,      "calib_level        INTEGER     NOT NULL"},
+         {obs_collection,   "obs_collection     VARCHAR     NOT NULL"},
+         {obs_id,           "obs_id             VARCHAR     NOT NULL"},
+         {obs_publisher_id, "obs_publisher_did  VARCHAR     PRIMARY KEY"},
+         {access_url,       "access_url         TEXT"},
+         {access_format,    "access_format      VARCHAR"},
+         {access_estsize,   "access_estsize     BIGINT"},
+         {target_name,      "target_name        VARCHAR"},
+         {s_ra,             "s_ra               double precision"},
+         {s_dec,            "s_dec              double precision"},
+         {s_fov,            "s_fov              double precision"},
+         {s_region,         "s_region           VARCHAR"},
+         {s_xel1,           "s_xel1             bigint"},
+         {s_xel2,           "s_xel2             bigint"},
+         {s_resolution,     "s_resolution       double precision"},
+         {t_min,            "t_min              double precision"},
+         {t_max,            "t_max              double precision"},
+         {t_exptime,        "t_exptime          double precision"},
+         {t_resolution,     "t_resolution       double precision"},
+         {t_xel,            "t_xel              bigint"},
+         {em_min,           "em_min             double precision"},
+         {em_max,           "em_max             double precision"},
+         {em_res_power,     "em_res_power       double precision"},
+         {em_xel,           "em_xel             bigint"},
+         {o_ucd,            "o_ucd              VARCHAR"},
+         {pol_states,       "pol_states         VARCHAR"},
+         {pol_xel,          "pol_xel            bigint"},
+         {facility_name,    "facility_name      VARCHAR"},
+         {instrument_name,  "instrument_name    VARCHAR"},
+         {coordinates,      "coordinates        spoint"},
+         {polygon_region_galactic,   "polygon_region_galactic     spoly"},
+         {polygon_region,   "polygon_region     spoly"},
+         {proposal_id,      "proposal_id        VARCHAR"},
+         {policy ,          "policy             auth_policy NOT NULL"},
+         {groups,           "groups             TEXT[]      NULL"}
+      };
+
+
+   initCols();
+}
+
+
+void SqlSchema::initCols(void)
+{
+//   using namespace columns;
+
+   m_columns[headers]         = "(" + headersRow.concat_val_first_word() + ")";
+   m_columns[boundsgal]       = "(" + boundsgalRow.concat_val_first_word() + ")";
+   m_columns[boundsicrs]      = "(" + boundsicrsRow.concat_val_first_word() + ")";
+   m_columns[verticesgal]     = "(" + verticesgalRow.concat_val_first_word() + ")";
+   m_columns[verticesicrs]    = "(" + verticesicrsRow.concat_val_first_word() + ")";
+   m_columns[obscore]         = "(" + obscoreRow.concat_val_first_word() + ")";
+};
+
+
+
+void SqlSchema::appendRow(void)
+{
+//   using namespace columns;
+
+   m_values[headers]     += "(" + headersRow.concat_val_all()  + "),";
+   m_values[boundsgal]   += "(" + boundsgalRow.concat_val_all()   + "),";
+   m_values[boundsicrs]  += "(" + boundsicrsRow.concat_val_all()   + "),";
+   m_values[verticesgal] += "(" + verticesgalRow.concat_val_all() + "),";
+   m_values[verticesicrs] += "(" + verticesicrsRow.concat_val_all() + "),";
+   m_values[obscore]     += "(" + obscoreRow.concat_val_all()  + "),";
+};
+
+
+
+
+
+// skip last character
+string SqlSchema::skl(string str)
+{
+   return str.substr(0, str.size() - 1);
+}
+
+
+
+
+#if 0
+vector<string> SqlSchema::concatedValues(void)
+{
+   vector<string> vVals;
+   std::map<Tables, std::string>::const_iterator it = m_values.begin();
+   while(it != m_values.end())
+   {
+      Tables t = it->first;
+      string cmd =  it->second;
+
+      vVals.push_back(cmd);
+      it++;
+   }
+   return vVals;
+}
+
+
+vector<string> SqlSchema::map2vect(const map<SqlSchema::Tables, string>& amap)
+{
+   vector<string> vVals;
+   std::map<SqlSchema::Tables, std::string>::const_iterator it = amap.begin();
+   while(it != amap.end())
+   {
+      SqlSchema::Tables t = it->first;
+      string cmd =  it->second;
+
+      vVals.push_back(cmd);
+      it++;
+   }
+   return vVals;
+}
+
+vector<string> SqlSchema::concatedColumns(void)
+{
+   return map2vect(m_columns);
+}
+#endif
+
+
diff --git a/data-access/engine/src/vlkb-obscore/src/database/SqlSchema.hpp b/data-access/engine/src/vlkb-obscore/src/database/SqlSchema.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..a9eae6a91a341cf6544548510db1da15d8e41c27
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/database/SqlSchema.hpp
@@ -0,0 +1,89 @@
+
+#ifndef SQLSCHEMA_HPP
+#define SQLSCHEMA_HPP
+
+#include <string>
+#include <vector>
+
+#include "colmap.hpp"
+
+//---------------------------------------------------------
+// multivalued command(s) return vector of string:
+// * if database is implemented in more tables, returned
+//   vector contains commands for each table and all commands
+//   should be executed as transaction
+// * if database is in one table, vector will have one
+//   string only
+//---------------------------------------------------------
+
+
+class SqlSchema
+{
+   public:
+
+      SqlSchema();
+      void initCols(void);
+      void appendRow(void);
+
+
+      // groups of columns
+
+      enum class headersColId {header_id, filename, hdunum, pubdid, header, survey_id};
+      enum boundsgalColId {bgheader_id, lfrom, lto, lunit, bfrom, bto, bunit, vfrom, vto, vunit};
+      enum boundsicrsColId {biheader_id, rfrom, rto, runit, dfrom, dto, dunit};
+      enum verticesgalColId {vgheader_id, p1lon, p1lat, p2lon, p2lat, p3lon, p3lat, p4lon, p4lat};
+      enum verticesicrsColId {viheader_id, p1ra, p1dec, p2ra, p2dec, p3ra, p3dec, p4ra, p4dec};
+      enum obscoreColId {dataproduct_type, calib_level,
+         obs_collection, obs_id, obs_publisher_id,
+         access_url, access_format, access_estsize, target_name,
+         s_ra, s_dec, s_fov, s_region, s_xel1, s_xel2, s_resolution,
+         t_min, t_max, t_exptime, t_resolution, t_xel,
+         em_min, em_max, em_res_power, em_xel,
+         o_ucd,
+         pol_states, pol_xel,
+         facility_name, instrument_name,
+         polygon_region_galactic, polygon_region, coordinates,
+         proposal_id,
+         policy, groups};
+
+      colmap<headersColId,      std::string> headersRow;
+      colmap<boundsgalColId,    std::string> boundsgalRow;
+      colmap<boundsicrsColId,   std::string> boundsicrsRow;
+      colmap<verticesgalColId,  std::string> verticesgalRow;
+      colmap<verticesicrsColId, std::string> verticesicrsRow;
+      colmap<obscoreColId,      std::string> obscoreRow;
+
+
+      // organize columns into tables
+
+      enum Tables {headers,
+         boundsgal, boundsicrs, verticesgal, verticesicrs,
+         obscore};
+
+      std::map<Tables, std::string> m_columns;
+      std::map<Tables, std::string> m_values;
+
+      // utils
+
+      std::string eto_string(enum headersColId cid) {return headersRow[cid].substr(0, headersRow[cid].find(' ')); };
+      std::string eto_string(enum boundsgalColId cid) {return boundsgalRow[cid].substr(0, boundsgalRow[cid].find(' ')); };
+
+      std::string eto_string(enum Tables tid) {return tableNames[tid].substr(0, tableNames[tid].find(' ')); };
+
+      void reset(void);
+      // skip last character
+      std::string skl(std::string str);
+
+      std::map<Tables, std::string> tableNames{
+         {headers , "headers"},
+         {boundsgal, "cubeboundsgalactic"},
+         {boundsicrs, "cubeboundsicrs"},
+         {verticesgal, "verticesgalactic"},
+         {verticesicrs, "verticesicrs"},
+         {obscore, "obscore"},
+      };
+
+};
+
+#endif
+
diff --git a/data-access/engine/src/vlkb-obscore/src/database/SqlSchema_CREATE.cpp b/data-access/engine/src/vlkb-obscore/src/database/SqlSchema_CREATE.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..02de746c3a7d1a5fa1c289ae011ad220381df012
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/database/SqlSchema_CREATE.cpp
@@ -0,0 +1,33 @@
+
+
+#include "SqlSchema_CREATE.hpp"
+
+using namespace std;
+
+SqlSchema_CREATE::SqlSchema_CREATE(void)
+{
+   appendRow();
+}
+
+// skip the first '(' and the last ',' character
+// FIXME fragile: assumes str starts with '(' and ends with ','
+string skfl(string str)
+{
+   return str.substr(1, str.size() - 2);
+}
+
+
+vector<string> SqlSchema_CREATE::getCREATE(void)
+{
+   vector<string> vCREATE{
+    /*  {"DROP TABLE IF EXISTS headers CASCADE; CREATE TABLE headers (" + skfl(m_values[headers])},
+         {"DROP TABLE IF EXISTS cubeboundsgalactic CASCADE; CREATE TABLE cubeboundsgalactic (" + skfl(m_values[boundsgal])},
+         {"DROP TABLE IF EXISTS cubeboundsicrs CASCADE; CREATE TABLE cubeboundsicrs (" + skfl(m_values[boundsicrs])},
+         {"DROP TABLE IF EXISTS verticesgalactic CASCADE; CREATE TABLE verticesgalactic (" + skfl(m_values[verticesgal])},
+         {"DROP TABLE IF EXISTS verticesicrs CASCADE; CREATE TABLE verticesicrs (" + skfl(m_values[verticesicrs])},
+    */
+         {"DROP TABLE IF EXISTS obscore CASCADE; CREATE TABLE obscore " + skl(m_values[obscore])},
+   };
+   return vCREATE;
+}
+
diff --git a/data-access/engine/src/vlkb-obscore/src/database/SqlSchema_CREATE.hpp b/data-access/engine/src/vlkb-obscore/src/database/SqlSchema_CREATE.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..414af65819cec007ba591c16bee7012464cff398
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/database/SqlSchema_CREATE.hpp
@@ -0,0 +1,20 @@
+#ifndef SQLSCHEMA_CREATE_HPP
+#define SQLSCHEMA_CREATE_HPP
+
+#include <string>
+#include <vector>
+
+#include "SqlSchema.hpp"
+
+
+class SqlSchema_CREATE : public SqlSchema
+{
+   public:
+
+   SqlSchema_CREATE(void);
+
+   std::vector<std::string> getCREATE(void);
+};
+
+
+#endif
diff --git a/data-access/engine/src/vlkb-obscore/src/database/SqlSchema_DELETE.cpp b/data-access/engine/src/vlkb-obscore/src/database/SqlSchema_DELETE.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..b9043731c7ac86aa70107542a7edd2f8ed2d1063
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/database/SqlSchema_DELETE.cpp
@@ -0,0 +1,44 @@
+
+
+#include "SqlSchema_DELETE.hpp"
+
+
+using namespace std;
+
+SqlSchema_DELETE::SqlSchema_DELETE(void)
+{
+// FIXME ????? appendRow();
+}
+
+
+vector<string> SqlSchema_DELETE::getCommand(int sid, const Survey& surv)
+{
+   string sidStr     = to_string(sid);
+   string obsCollStr = surv.getObsCollection();// FIXME + to_sqlstring(surv.getObsCollection()); and remove apostrophes below...
+
+   vector<string> vDELETE{
+/*      {"DELETE FROM cubeboundsgalactic WHERE header_id IN (SELECT header_id FROM headers WHERE survey_id = " + sidStr + ")"},
+      {"DELETE FROM cubeboundsicrs     WHERE header_id IN (SELECT header_id FROM headers WHERE survey_id = " + sidStr + ")"},
+      {"DELETE FROM verticesgalactic   WHERE header_id IN (SELECT header_id FROM headers WHERE survey_id = " + sidStr + ")"},
+      {"DELETE FROM verticesicrs       WHERE header_id IN (SELECT header_id FROM headers WHERE survey_id = " + sidStr + ")"},
+      {"DELETE FROM headers            WHERE survey_id = " + sidStr},
+*/
+      {"DELETE FROM obscore WHERE obs_collection LIKE \'" + obsCollStr + "\'"},
+   };
+   return vDELETE;
+}
+
+/* NOTE:
+
+obscore handled separately
+obscore_vlkb handled separately
+
+static relative to for-cycle: sid & tabName[headers]
+
+for(tableName : tableNames)
+{
+   cmdString = "DELETE FROM " + tableName + " WHERE header_id IN (SELECT header_id FROM " + tabName[headers] + " WHERE survey_id = " + to_string(sid) + ")"
+   vCmd.push_back(cmdString);
+}
+
+*/
diff --git a/data-access/engine/src/vlkb-obscore/src/database/SqlSchema_DELETE.hpp b/data-access/engine/src/vlkb-obscore/src/database/SqlSchema_DELETE.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..43e26746e35779f08605f1c7f5963d0037dcb1e5
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/database/SqlSchema_DELETE.hpp
@@ -0,0 +1,21 @@
+#ifndef SQLSCHEMA_DELETE_HPP
+#define SQLSCHEMA_DELETE_HPP
+
+#include <string>
+#include <vector>
+
+#include "SqlSchema.hpp"
+#include "SqlSurvey.hpp"
+
+
+class SqlSchema_DELETE : public SqlSchema
+{
+   public:
+
+   SqlSchema_DELETE(void);
+
+   std::vector<std::string> getCommand(int sid, const Survey& surv);
+};
+
+
+#endif
diff --git a/data-access/engine/src/vlkb-obscore/src/database/SqlSchema_INSERT.cpp b/data-access/engine/src/vlkb-obscore/src/database/SqlSchema_INSERT.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..e929312f09092e834d872d6999de6d0f4e32b08c
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/database/SqlSchema_INSERT.cpp
@@ -0,0 +1,399 @@
+
+#include "SqlSchema_INSERT.hpp"
+
+#include <string>
+#include <stdexcept>
+#include <math.h> // fabs needed
+
+#include <algorithm>
+#include <vector>
+
+#include "dataset_id.hpp"
+#include "ObsCoreTime.hpp"
+#include "io.hpp"
+#include "ast4vl.hpp"
+#include "my_assert.hpp"
+
+
+
+using namespace std;
+
+
+//----------------------------------------------------------------
+// utils
+//----------------------------------------------------------------
+
+string authPolicyToSQLEnum(string csvPolicy)
+{
+   if((csvPolicy.compare("FREE") == 0) ||
+      (csvPolicy.compare("PRIV") == 0)) return csvPolicy;
+
+
+   if(csvPolicy.compare("PUBLIC") == 0)
+   {
+      return "FREE";
+   }
+   else if(csvPolicy.compare("PRIVATE") == 0)
+   {
+      return "PRIV";
+   }
+   else
+   {
+      LOG_STREAM << "authPolicyToSQLEnum: unrecognized csv-Policy: " + csvPolicy << endl;
+      throw std::invalid_argument( "unrecognized auth-Policy string: " + csvPolicy );
+   }
+}
+
+/* order vertices for polygon */
+
+bool comp_vertex(point2d a, point2d b)
+{
+   return ( atan2(a.lat, a.lon) < atan2(b.lat, b.lon) );
+}
+
+vector<point2d> to_point2dvec(int len, double * vertlon, double * vertlat)
+{
+   vector<point2d> vertex;
+   int ii;
+   for(ii=0; ii<len; ii++) {vertex.push_back(point2d{vertlon[ii], vertlat[ii]});}
+   return vertex;
+}
+
+/* re-orders points in lon,lat arrays to form a convex polygon (no lines cross) */
+/* FIXME this re-ordering would not be needed if the algorithm generating
+ * vertices in 'ast::frameset::sky_vertices(void)' produced them already
+ * ordered for polygons; the projection is not expected to change the
+ * angular order of the vertices */
+void reorder_vertices(vector<point2d>& vertex)
+{
+   /* calc center */
+
+   double clon=0.0, clat=0.0;
+   for(point2d vert : vertex)
+   {
+      clon += vert.lon;
+      clat += vert.lat;
+   }
+   double count = (double)vertex.size();
+   clon /= count;
+   clat /= count;
+
+   /* shift vertices to a coord-system at the center, sort, and shift back */
+   
+   int ii;
+   for(ii=0; ii<count; ii++) {vertex[ii].lon -= clon; vertex[ii].lat -= clat;}
+
+   sort(vertex.begin(), vertex.end(), comp_vertex);
+
+   for(ii=0; ii<count; ii++) {vertex[ii].lon += clon; vertex[ii].lat += clat;}
+   
+   return;
+}
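+
+/* A sketch with assumed values: {(1,0),(0,1),(1,1),(0,0)} has centroid
+ * (0.5,0.5) and sorts by angle around it to {(0,0),(1,0),(1,1),(0,1)},
+ * a convex traversal of the unit square. */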
+
+string region_spoly(vector<point2d> vert)
+{
+   my_assert((vert.size()==4), __FILE__,__LINE__, "expected 4 vertices, but found " + to_string(vert.size()) );
+
+   string spoly = 
+      "{ " 
+      "(" + to_string(vert[0].lon) + "d," + to_string(vert[0].lat) + "d),"
+      "(" + to_string(vert[1].lon) + "d," + to_string(vert[1].lat) + "d),"
+      "(" + to_string(vert[2].lon) + "d," + to_string(vert[2].lat) + "d),"
+      "(" + to_string(vert[3].lon) + "d," + to_string(vert[3].lat) + "d)"
+      " }";
+
+   LOG_STREAM << "spoly: " << spoly << endl;
+
+   return spoly;
+}
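+/* Illustrative output for hypothetical vertices (10,-1),(12,-1),(12,1),(10,1):
+ * "{ (10.000000d,-1.000000d),(12.000000d,-1.000000d),(12.000000d,1.000000d),(10.000000d,1.000000d) }"
+ * i.e. the 'spoly' polygon literal (pgsphere-style) with angles in degrees. */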
+
+string region_stcs(vector<point2d> vert)
+{
+   my_assert((vert.size()==4), __FILE__,__LINE__, "expected 4 vertices, but found " + to_string(vert.size()) );
+
+   string stcs = 
+      "Polygon ICRS " 
+      + to_string(vert[0].lon) + " " + to_string(vert[0].lat) + " " 
+      + to_string(vert[1].lon) + " " + to_string(vert[1].lat) + " " 
+      + to_string(vert[2].lon) + " " + to_string(vert[2].lat) + " " 
+      + to_string(vert[3].lon) + " " + to_string(vert[3].lat) + " " 
+      " unit deg";
+
+   LOG_STREAM << "stcs:" << stcs << endl;
+
+   return stcs;
+}
+
+string to_sqlstring(string str) {return "\'" + str + "\'";}
+
+// FIXME replace this
+string asSqlString(fitsfiles::key_values_by_type key_values, string key)
+{
+   if(key_values.strValues.count(key)>0)
+   {
+      return "\'" + key_values.strValues[key] + "\'";
+   }
+   else if(key_values.uintValues.count(key)>0)
+   {
+      return to_string(key_values.uintValues[key]);
+   }
+   else if(key_values.doubleValues.count(key)>0)
+   {
+      return to_string(key_values.doubleValues[key]);
+   }
+   else
+   {
+      return "NULL";
+   }
+}
+
+
+
+
+//----------------------------------------------------------------------
+// public API
+//----------------------------------------------------------------------
+void SqlSchema_INSERT::appendRow(/*const int hid, const int sid,*/
+      const string& obscore_publisher,
+      const string& obscore_access_format,
+      const string& obscore_access_url,
+      const Survey& surv, const string& authGroups,
+      /*const*/ fitsfiles::Hdu& hdu,
+      //n  ObsCoreKeys ocKeys,
+      const std::string& filename, const uintmax_t filesize)
+{
+   LOG_trace(__func__);
+
+   // header based computations
+
+#if 0
+   const int specSystem = 1;// LSRK
+   vector<struct Bounds> galBounds = 
+      legacy::calcBounds(hdu.m_header, "GALACTIC", specSystem);
+   vector<struct Bounds> icrsBounds = 
+      legacy::calcBounds(hdu.m_header, "ICRS", specSystem);
+   struct Vertices galVerts  = legacy::calcVertices(hdu.m_header, "GALACTIC");
+   struct Vertices icrsVerts = legacy::calcVertices(hdu.m_header, "ICRS");
+#else
+   const string VELOLSRK{"System=VELO,StdOfRest=LSRK,Unit=km/s"};
+
+   vector<struct Bounds> galBounds  = calc_bounds(hdu.m_header, "GALACTIC", VELOLSRK);
+   vector<struct Bounds> icrsBounds = calc_bounds(hdu.m_header, "ICRS", VELOLSRK);
+
+   vector<point2d> galVerts  = calc_skyvertices(hdu.m_header, "GALACTIC");
+   vector<point2d> icrsVerts = calc_skyvertices(hdu.m_header, "ICRS");
+
+   reorder_vertices(galVerts);
+   reorder_vertices(icrsVerts);
+#endif
+
+   // construct publisherDID
+
+   bool scramble = false;
+   const std::string pubdid(dataset_id::create(surv.storagePath,filename,hdu.m_hdunum, scramble));
+
+   ////////////////////////////////////////////////////////////////////////
+
+   // set table values
+/*
+   headersRow[headersColId::header_id] = to_string(hid);
+   headersRow[headersColId::filename] = to_sqlstring(filename);
+   headersRow[headersColId::hdunum]   = to_string(hdu.m_hdunum);
+   headersRow[headersColId::pubdid]   = to_sqlstring(pubdid);
+   headersRow[headersColId::header]   = "$$" + hdu.m_header + "$$";
+   headersRow[headersColId::survey_id] = to_string(sid);
+
+   // cubebounds - GALACTIC
+
+   boundsgalRow[bgheader_id] = to_string(hid);
+   boundsgalRow[lfrom] = to_string(galBounds[0].low);//.x_from);
+   boundsgalRow[lto]   = to_string(galBounds[0].up);//.x_to);
+   boundsgalRow[lunit] = to_sqlstring(galBounds[0].unit);//x_unit);
+   boundsgalRow[bfrom] = to_string(galBounds[1].low);//y_from);
+   boundsgalRow[bto]   = to_string(galBounds[1].up);//y_to);
+   boundsgalRow[bunit] = to_sqlstring(galBounds[1].unit);//y_unit);
+   if(galBounds.size() >2)
+   {
+      boundsgalRow[vfrom] = to_string(galBounds[2].low);//vel_from);
+      boundsgalRow[vto]   = to_string(galBounds[2].up);//vel_to);
+      boundsgalRow[vunit] = to_sqlstring(galBounds[2].unit);//vel_unit);
+   }
+   else
+   {
+      boundsgalRow[vfrom] = "NULL";
+      boundsgalRow[vto]   = "NULL";
+      boundsgalRow[vunit] = "NULL";
+   }
+
+   // cubebounds - ICRS
+
+   boundsicrsRow[biheader_id] = to_string(hid);
+   boundsicrsRow[rfrom] = to_string(icrsBounds[0].low);//x_from);
+   boundsicrsRow[rto]   = to_string(icrsBounds[0].up);//x_to);
+   boundsicrsRow[runit] = to_sqlstring(icrsBounds[0].unit);//x_unit);
+   boundsicrsRow[dfrom] = to_string(icrsBounds[1].low);//y_from);
+   boundsicrsRow[dto]   = to_string(icrsBounds[1].up);//y_to);
+   boundsicrsRow[dunit] = to_sqlstring(icrsBounds[1].unit);//y_unit);
+
+   // vertices - GALACTIC
+
+   verticesgalRow[vgheader_id] = to_string(hid);
+   verticesgalRow[p1lon] = to_string(galVerts[0].lon);
+   verticesgalRow[p1lat] = to_string(galVerts[0].lat);
+   verticesgalRow[p2lon] = to_string(galVerts[1].lon);
+   verticesgalRow[p2lat] = to_string(galVerts[1].lat);
+   verticesgalRow[p3lon] = to_string(galVerts[2].lon);
+   verticesgalRow[p3lat] = to_string(galVerts[2].lat);
+   verticesgalRow[p4lon] = to_string(galVerts[3].lon);
+   verticesgalRow[p4lat] = to_string(galVerts[3].lat);
+
+
+   // vertices - ICRS
+
+   verticesicrsRow[viheader_id] = to_string(hid);
+   verticesicrsRow[p1ra]  = to_string(icrsVerts[0].lon);
+   verticesicrsRow[p1dec] = to_string(icrsVerts[0].lat);
+   verticesicrsRow[p2ra]  = to_string(icrsVerts[1].lon);
+   verticesicrsRow[p2dec] = to_string(icrsVerts[1].lat);
+   verticesicrsRow[p3ra]  = to_string(icrsVerts[2].lon);
+   verticesicrsRow[p3dec] = to_string(icrsVerts[2].lat);
+   verticesicrsRow[p4ra]  = to_string(icrsVerts[3].lon);
+   verticesicrsRow[p4dec] = to_string(icrsVerts[3].lat);
+*/
+
+   // obscore - values from Surveys table
+
+   string keyFacility      = surv.fitskeyFacilityName;
+   string keyInstrument    = surv.fitskeyInstrumentName;
+
+   obscoreRow[dataproduct_type] = to_sqlstring(surv.dataproductType);
+   obscoreRow[calib_level]      = to_string(surv.calibLevel);
+   obscoreRow[obs_collection]   = to_sqlstring(surv.getObsCollection());
+   obscoreRow[o_ucd]            = to_sqlstring(surv.oUcd);
+   obscoreRow[policy]           = to_sqlstring(authPolicyToSQLEnum(surv.authPolicy)) + "::auth_policy";
+
+
+   // obscore - values derived from filename
+
+   obscoreRow[obs_id]           = to_sqlstring(filename.substr(0, filename.find_last_of(".")));
+   obscoreRow[obs_publisher_id] = obscore_publisher.empty() ? to_sqlstring(pubdid) : to_sqlstring(obscore_publisher + "?" + pubdid);
+   obscoreRow[access_url]       = obscore_access_url.empty() ? "" : to_sqlstring(
+         obscore_access_url + "/" + surv.storagePath + "/" + filename);
+   obscoreRow[access_format]    = to_sqlstring(obscore_access_format);
+   obscoreRow[access_estsize]   = to_string(filesize/1024); // [KB]
+
+
+   // obscore - values taken from header cards
+
+   obscoreRow[facility_name]    = asSqlString(hdu.key_values, keyFacility);
+   obscoreRow[instrument_name]  = asSqlString(hdu.key_values, keyInstrument);
+   obscoreRow[target_name]      = asSqlString(hdu.key_values, "OBJECT");
+   obscoreRow[s_xel1]           = asSqlString(hdu.key_values, "NAXIS1");
+   obscoreRow[s_xel2]           = asSqlString(hdu.key_values, "NAXIS2");
+   obscoreRow[s_resolution]     = "NULL"; // double FIXME CDELT+CUNIT->arcsec ? which axis?
+
+
+   // obscore - vals computed from header
+
+   // note: max() below resolves to std::max from <algorithm> (included above)
+   /*   double ds_ra  = (icrsBounds.x_from + icrsBounds.x_to)/2.0;
+        double ds_dec = (icrsBounds.y_from + icrsBounds.y_to)/2.0;
+        double ds_fov = max( fabs(icrsBounds.x_from - icrsBounds.x_to),
+        fabs(icrsBounds.y_from - icrsBounds.y_to) );
+        */
+   double ds_ra  = (icrsBounds[0].low + icrsBounds[0].up)/2.0;
+   double ds_dec = (icrsBounds[1].low + icrsBounds[1].up)/2.0;
+   double ds_fov = max( fabs(icrsBounds[0].low - icrsBounds[0].up),
+         fabs(icrsBounds[1].low - icrsBounds[1].up) );
+
+
+   obscoreRow[s_ra]  = to_string(ds_ra);
+   obscoreRow[s_dec] = to_string(ds_dec);
+   obscoreRow[s_fov] = to_string(ds_fov);
+
+   obscoreRow[s_region] = to_sqlstring(region_stcs(icrsVerts));
+
+   // polygon_region & coordinates are IA2-extensions
+   obscoreRow[polygon_region_galactic] = to_sqlstring(region_spoly(galVerts));
+   obscoreRow[polygon_region] = to_sqlstring(region_spoly(icrsVerts));
+   obscoreRow[coordinates]    = to_sqlstring("(" + obscoreRow[s_ra] + "," + obscoreRow[s_dec] + ")");
+
+
+   // obscore - spectral axis
+
+   if(icrsBounds.size() >= 3) // 3D cubes
+   {
+      obscoreRow[em_min]        = to_string(galBounds[2].low);
+      obscoreRow[em_max]        = to_string(galBounds[2].up);
+      obscoreRow[em_res_power]  = asSqlString(hdu.key_values, "CDELT3");
+      obscoreRow[em_xel]        = asSqlString(hdu.key_values, "NAXIS3");
+   }
+#if 0
+   else if(icrsBounds.naxis == 2) // 2D images
+   {
+      if(!strcmp(survey.survSpecies.c_str(), "Continuum"))
+         dem_min = dem_max = 0.0; // FIXME transtod(psurv->transition);
+   }
+#endif
+   else
+   {
+      obscoreRow[em_min]       = "NULL";
+      obscoreRow[em_max]       = "NULL";
+      obscoreRow[em_res_power] = "NULL";
+      obscoreRow[em_xel]       = "NULL";
+   }
+
+
+   // obscore - time
+
+   string DATEOBS;
+   string DATEEND;
+
+   if(hdu.key_values.strValues["DATE-OBS"].size() > 0) DATEOBS = hdu.key_values.strValues["DATE-OBS"];
+   if(hdu.key_values.strValues["DATE-END"].size() > 0) DATEEND = hdu.key_values.strValues["DATE-END"];
+
+   obscore::ObsCoreTime time =
+      obscore::calcObsCoreTime(DATEOBS, DATEEND);
+
+   obscoreRow[t_min]        = time.t_min_str; //to_string(time.t_min);
+   obscoreRow[t_max]        = time.t_max_str; //to_string(time.t_max);
+   obscoreRow[t_exptime]    = to_string(time.t_exptime);
+   obscoreRow[t_resolution] = to_string(time.t_resolution);
+   obscoreRow[t_xel]        = to_string(time.t_xel);
+
+
+   // obscore - stokes params
+
+   obscoreRow[pol_states] = "NULL";
+   obscoreRow[pol_xel]    = "NULL";
+
+
+   // obscore - misc
+
+   obscoreRow[proposal_id] = "NULL";
+   obscoreRow[obscoreColId::groups] = to_sqlstring("{" + authGroups + "}");
+
+   // end: all data set
+
+
+   SqlSchema::appendRow();
+}
+
+
+
+
+vector<string> SqlSchema_INSERT::getINSERT(void)
+{
+   vector<string> vCmds{
+  /*    {"INSERT INTO headers " + m_columns[headers]        + " VALUES "             + skl(m_values[headers])},
+         {"INSERT INTO cubeboundsgalactic " + m_columns[boundsgal]      + " VALUES "  + skl(m_values[boundsgal])},
+         {"INSERT INTO cubeboundsicrs " + m_columns[boundsicrs]     + " VALUES "      + skl(m_values[boundsicrs])},
+         {"INSERT INTO verticesgalactic " + m_columns[verticesgal]    + " VALUES "    + skl(m_values[verticesgal])},
+         {"INSERT INTO verticesicrs " + m_columns[verticesicrs]   + " VALUES "        + skl(m_values[verticesicrs])},
+  */	 
+         {"INSERT INTO obscore " + m_columns[obscore]        + " VALUES "             + skl(m_values[obscore])},
+   };
+   reset();
+   return vCmds;
+}
+
diff --git a/data-access/engine/src/vlkb-obscore/src/database/SqlSchema_INSERT.hpp b/data-access/engine/src/vlkb-obscore/src/database/SqlSchema_INSERT.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..1a9e915c0f40fc066ee2738aaa18fb2d3de4fcf1
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/database/SqlSchema_INSERT.hpp
@@ -0,0 +1,35 @@
+#ifndef SQLSCHEMA_INSERT_HPP
+#define SQLSCHEMA_INSERT_HPP
+
+#include <string>
+#include <vector>
+
+#include "SqlSchema.hpp"
+#include "SqlSurvey.hpp"
+
+#include "fitsfiles.hpp" // Hdu needed
+#include "ObsCoreKeys.hpp"
+
+
+std::string createPubdid(const std::string path, const std::string filename, unsigned int hdunum);
+
+class SqlSchema_INSERT : public SqlSchema
+{
+   public:
+
+      void appendRow(/*const int hid, const int sid,*/
+            const std::string& obscore_publisher,
+            const std::string& obscore_access_format,
+            const std::string& obscore_access_url,
+            const Survey& surv, const std::string& authGroups,
+            /*const*/ fitsfiles::Hdu& hdu,
+//            ObsCoreKeys ocKeys,
+            const std::string& filename, const uintmax_t filesize);
+
+
+      std::vector<std::string> getINSERT(void);
+
+};
+
+
+#endif
diff --git a/data-access/engine/src/vlkb-obscore/src/database/SqlSurvey.cpp b/data-access/engine/src/vlkb-obscore/src/database/SqlSurvey.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..2f45de03b8be407bb598106ec529b27cb94023d3
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/database/SqlSurvey.cpp
@@ -0,0 +1,176 @@
+
+#include "parse_surveys_csv.hpp"
+
+#include <iostream>
+#include <string>
+#include <algorithm> // replace() needed
+
+#include "SqlSurvey.hpp"
+
+#include "io.hpp"
+
+using namespace std;
+using namespace survey_ns;
+
+
+
+colmap<ColumnId,string> Survey::m_row
+= {
+   {name,               "name              varchar(24)     DEFAULT NULL"},
+   {species,            "species           varchar(24)     DEFAULT NULL"},
+   {transition,         "transition        varchar(32)     DEFAULT NULL"},
+   {rest_frequency,     "rest_frequency    double precision DEFAULT NULL"},
+   {restf_fits_unit,    "restf_fits_unit   varchar(8)      DEFAULT NULL"},
+   {velocity_fits_unit, "velocity_fits_unit varchar(12)    DEFAULT NULL"},
+   {storage_path,       "storage_path      varchar(1024)   DEFAULT NULL"},
+   {file_filter,        "file_filter       varchar(255)    DEFAULT NULL"},
+   {description,        "description       varchar(4096)   DEFAULT NULL"},
+   {dataproduct_type,   "dataproduct_type  varchar"},
+   {calib_level,        "calib_level       integer         NOT NULL"},
+   {o_ucd,              "o_ucd             varchar"},
+   {fitskey_facility_name,    "fitskey_facility_name     varchar(12)    DEFAULT NULL"},
+   {fitskey_instrument_name,  "fitskey_instrument_name   varchar(12)    DEFAULT NULL"},
+   {auth_policy,        "auth_policy    varchar(8)        NOT NULL"}
+};
+
+
+
+
+Survey::Survey(string tableName):tabName{tableName}
+{
+   LOG_trace(__func__);
+
+   string firstField("survey_id SERIAL,");
+   string primaryKey(", PRIMARY KEY (survey_id)");
+
+   m_CREATE.clear();
+   m_CREATE.append("DROP TABLE IF EXISTS " + tableName + "; ");
+   m_CREATE.append("CREATE TABLE " + tableName + " (");
+   m_CREATE.append(firstField);
+   m_CREATE.append(m_row.concat_val_all());
+   m_CREATE.append(primaryKey);
+   m_CREATE.append(" )");
+};
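+// Illustrative statement produced for tableName "surveys" (columns abridged):
+//   DROP TABLE IF EXISTS surveys; CREATE TABLE surveys (survey_id SERIAL,
+//   name varchar(24) DEFAULT NULL, ..., PRIMARY KEY (survey_id) )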
+
+
+
+string Survey::getCREATE(void)
+{
+   return m_CREATE;
+};
+
+
+// INSERT into surveys table
+
+// FIXME duplicate: exists in vlkb/datasets/SqlSchema_INSERT
+string to_sqlstring2(string str) {return "\'" + str + "\'";}
+
+string to_sql_values(survey surv)
+{
+   return "("
+      + to_string(surv.survey_id) + ", "
+      + to_sqlstring2(surv.name) + ", "
+      + to_sqlstring2(surv.species) + ", "
+      + to_sqlstring2(surv.transition) + ", "
+      + to_string(surv.rest_frequency_Hz) + ", "
+      + to_sqlstring2("") + ", "
+      // FIXME NOT USED + to_sqlstring2(surv.restf_fits_unit) + ", "
+      + to_sqlstring2(to_string(surv.velocity_fits_unit)) + ", "
+      + to_sqlstring2(surv.storage_path) + ", "
+      + to_sqlstring2(surv.file_filter) + ", "
+      + to_sqlstring2(surv.description) + ", "
+      + to_sqlstring2(to_string(surv.dataproduct_type)) + ", "
+      + to_string(to_uint(surv.calib_level)) + ", "
+      + to_sqlstring2(surv.o_ucd) + ", "
+      + to_sqlstring2(surv.fitskey_facility_name) + ", "
+      + to_sqlstring2(surv.fitskey_instrument_name) + ", "
+      + to_sqlstring2(to_string(surv.auth_policy)) + " "
+      "),";
+}
+
+string Survey::getINSERT(string surveysPathName)
+{
+   const bool skip_first_row = true;
+   vector<survey> sub_surveys{parse_surveys(surveysPathName, skip_first_row)};
+
+   string cmd_sql{"INSERT INTO " + tabName + " VALUES "};
+
+   for(survey ssurv : sub_surveys)
+      cmd_sql.append(to_sql_values(ssurv));
+   cmd_sql.pop_back(); // removes last comma
+
+   return cmd_sql;
+}
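+// Illustrative statement (one VALUES tuple per CSV row, fields abridged):
+//   INSERT INTO surveys VALUES (1, 'SurveyA', 'CO', '1-0', ...),(2, ...)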
+
+
+
+std::string Survey::qry_SELECT_Attribs(int sid)
+{
+   return "SELECT " + m_row.concat_val_first_word()
+      + " FROM surveys WHERE survey_id = " + to_string(sid);
+}
+
+
+
+string Survey::getStorageFilter(void)
+{
+   string storageFilter = storagePath + "/" + fileFilter;
+   replace(storageFilter.begin(),storageFilter.end(),'%','*');// FIXME do in setFileFilter()
+   return storageFilter;
+}
+
+
+string Survey::getObsCollection(void) const
+{
+   return survName +" "+ survSpecies +" "+ survTransition;
+}
+
+
+
+
+
+
+
+
+string Survey::colName(ColumnId cid)
+{
+   return m_row[cid].substr(0, m_row[cid].find(' '));
+   // if no match found, find() returns string::npos -> end-of-the-string
+}
+
+
+void Survey::handleResult(pqxx::result res)
+{
+   LOG_trace(__func__);
+
+   for(auto row: res)
+   {
+      for (auto field: row) LOG_STREAM << field.c_str() << " ";
+      LOG_STREAM << endl;
+
+      // FIXME found no confirm in pqxx doc that col order in row
+      // is the same as in list: SELECT list FROM...
+      // use explicit string column names instead enum
+      // FIXME 2 how to deal with empty SQL_fields ?
+      // for now set some default: func as<>(d) returns d if field empty (=null)
+      survName         = row[colName(name)      ].as<string>();
+      survSpecies      = row[colName(species)   ].as<string>();
+      survTransition   = row[colName(transition)].as<string>();
+
+      restFrequency    = row[colName(rest_frequency)    ].as<double>(0.0);
+      restFreqFitsUnit = row[colName(restf_fits_unit)   ].as<string>("");
+      velocityFitsUnit = row[colName(velocity_fits_unit)].as<string>("");
+      storagePath      = row[colName(storage_path)      ].as<string>();
+      fileFilter       = row[colName(file_filter)       ].as<string>();
+      survDescription  = row[colName(description)       ].as<string>();
+
+      dataproductType = row[colName(dataproduct_type)].as<string>();
+      calibLevel      = row[colName(calib_level)     ].as<int>();
+      oUcd            = row[colName(o_ucd)           ].as<string>("");
+      fitskeyFacilityName   = row[colName(fitskey_facility_name)].as<string>("");
+      fitskeyInstrumentName = row[colName(fitskey_instrument_name)].as<string>("");
+      authPolicy      = row[colName(auth_policy)     ].as<string>();
+
+   }
+}
+
diff --git a/data-access/engine/src/vlkb-obscore/src/database/SqlSurvey.hpp b/data-access/engine/src/vlkb-obscore/src/database/SqlSurvey.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..a94a9c1d15a418ab9b85f9a8ae915f9a84f52be4
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/database/SqlSurvey.hpp
@@ -0,0 +1,73 @@
+#ifndef SQLSURVEY_HPP
+#define SQLSURVEY_HPP
+
+#include <string>
+#include "colmap.hpp"
+
+#include <pqxx/pqxx>
+
+
+
+namespace survey_ns {
+
+enum ColumnId
+      {
+         name, species, transition,
+         rest_frequency, restf_fits_unit, velocity_fits_unit,
+         storage_path, file_filter,
+         description,
+         dataproduct_type, calib_level, o_ucd,
+         fitskey_facility_name, fitskey_instrument_name,
+         auth_policy
+      };
+}
+
+class Survey
+{
+   public:
+
+      Survey(std::string tableName);
+      std::string getCREATE();
+      std::string getINSERT(std::string surveysPathName);
+      std::string qry_SELECT_Attribs(int sid);
+      void handleResult(pqxx::result res);
+
+      std::string getStorageFilter(void);
+      std::string getObsCollection(void) const;
+
+      // table fields
+
+      double restFrequency;    // FIXME check in which units ? [Hz] [GHz]?
+
+      std::string restFreqFitsUnit; // FIXME enum not string
+      std::string velocityFitsUnit; // FIXME enum not string
+
+      std::string storagePath;
+      std::string fileFilter;
+
+      std::string survDescription;
+
+      std::string survName;
+      std::string survSpecies;
+      std::string survTransition;
+
+      std::string dataproductType; // FIXME enum : cube | image
+      int calibLevel;
+      std::string oUcd;
+      std::string fitskeyFacilityName;
+      std::string fitskeyInstrumentName;
+      std::string authPolicy; // sql_enum FIXME here enum too
+
+   private:
+
+      std::string tabName;
+      static colmap<survey_ns::ColumnId,std::string> m_row;
+      std::string m_CREATE;
+
+      // utils
+
+      std::string colName(survey_ns::ColumnId cid);
+};
+
+#endif
+
diff --git a/data-access/engine/src/vlkb-obscore/src/database/Sql_SELECT_SurveyBounds.cpp b/data-access/engine/src/vlkb-obscore/src/database/Sql_SELECT_SurveyBounds.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..5dc785b5f3d55e407997b10ce7512153eab7d623
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/database/Sql_SELECT_SurveyBounds.cpp
@@ -0,0 +1,20 @@
+
+#include "Sql_SELECT_SurveyBounds.hpp"
+
+using namespace std;
+
+string Sql_SELECT_SurveyBounds::qLonMinMax(int sid)
+{
+   return "SELECT min(lon_from),max(lon_to) FROM cubeboundsgalactic WHERE header_id IN "
+      "(SELECT header_id FROM headers WHERE survey_id = " + to_string(sid) + ")" ;
+}
+
+
+
+string Sql_SELECT_SurveyBounds::qLatMinMax(int sid)
+{
+   return "SELECT min(lat_from),max(lat_to) FROM cubeboundsgalactic WHERE header_id IN "
+      "(SELECT header_id FROM headers WHERE survey_id = " + to_string(sid) + ")";
+}
+
+
diff --git a/data-access/engine/src/vlkb-obscore/src/database/Sql_SELECT_SurveyBounds.hpp b/data-access/engine/src/vlkb-obscore/src/database/Sql_SELECT_SurveyBounds.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..346cfe650789aca1dd764bd529c6c625469fe4ca
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/database/Sql_SELECT_SurveyBounds.hpp
@@ -0,0 +1,13 @@
+#ifndef SQL_SELECT_SURVEYBOUNDS_HPP
+#define SQL_SELECT_SURVEYBOUNDS_HPP
+
+#include <string>
+
+namespace Sql_SELECT_SurveyBounds
+{
+   std::string qLonMinMax(int sid);
+   std::string qLatMinMax(int sid);
+}
+
+#endif
+
diff --git a/data-access/engine/src/vlkb-obscore/src/database/colmap.hpp b/data-access/engine/src/vlkb-obscore/src/database/colmap.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..9de1d1e578051b985b4b76d7b220cbd536441dbc
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/database/colmap.hpp
@@ -0,0 +1,70 @@
+
+#ifndef COLMAP_HPP
+#define COLMAP_HPP
+
+#include <map>
+#include <string>
+#include <iostream>
+#include <initializer_list>
+
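+// Maps column ids to "name type ..." SQL fragments. Illustrative usage:
+//   colmap<int,std::string> m{{0,"name varchar(24)"},{1,"survey_id integer"}};
+//   m.concat_val_all();        // -> "name varchar(24),survey_id integer"
+//   m.concat_val_first_word(); // -> "name,survey_id"
+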
+template<typename Id, typename T>
+class colmap : public std::map<Id, T>
+{
+   public:
+
+      colmap(void):std::map<Id,T>(){};
+      colmap(std::initializer_list<std::pair<Id const,T>> il):std::map<Id,T>(il){};
+
+
+      std::string concat_val_all(const char separator = ',')
+      {
+         it = this->begin();
+
+         if(it==this->end()) return "";
+         // FIXME how to handle this case? -> exception? there is nothing to concatenate
+
+         std::string fullstr(it->second);
+
+         it++;
+         while(it != this->end())
+         {
+            fullstr.append(separator + it->second);
+            it++;
+         }
+
+         return fullstr;
+      }
+
+      std::string concat_val_first_word(const char separator = ',')
+      {
+         it = this->begin();
+
+         if(it==this->end()) return "";
+         // FIXME how to handle this case? -> exception? there is nothing to concatenate
+
+         std::string fullstr(first_word(it->second));
+
+         it++;
+         while(it != this->end())
+         {
+            fullstr.append(separator + first_word(it->second));
+            it++;
+         }
+
+         return fullstr;
+      }
+
+
+   private:
+      typename std::map<Id, std::string>::iterator it;
+
+      // utils
+      std::string first_word(std::string str)
+      {
+         // if no match found, find() returns string::npos -> end-of-the-string
+         return str.substr(0, str.find(' '));
+      }
+
+};
+
+#endif
diff --git a/data-access/engine/src/vlkb-obscore/src/database/database.cpp b/data-access/engine/src/vlkb-obscore/src/database/database.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..b884a1be0a90bc5acd142424b01705443f5089c5
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/database/database.cpp
@@ -0,0 +1,368 @@
+
+#include "database.hpp"
+
+#include "DbConn.hpp"
+#include "SqlSchema.hpp"
+#include "SqlSchema_CREATE.hpp"
+#include "SqlSchema_INSERT.hpp"
+#include "SqlSchema_DELETE.hpp"
+#include "Sql_SELECT_SurveyBounds.hpp"
+#include "fitsfiles.hpp"
+#include "io.hpp"
+
+#include <iostream>
+#include <iomanip>
+#include <libgen.h> // basename() needed
+#include <iterator>
+
+
+using namespace std;
+
+// internal
+
+struct db_bounds
+{
+   unsigned int header_id;
+   double lon_from;
+   double lon_to;
+   double lat_from;
+   double lat_to;
+};
+
+vector<db_bounds> bounds_reversed(DbConn& db)
+{
+   LOG_trace(__func__);
+
+   const double default_when_null = 0.0;
+   struct db_bounds bounds;
+   vector<db_bounds> returned_bounds;
+
+   SqlSchema schema;
+   const string lon_from{schema.eto_string(SqlSchema::lfrom)};
+   const string lon_to  {schema.eto_string(SqlSchema::lto)};
+   const string lat_from{schema.eto_string(SqlSchema::bfrom)};
+   const string lat_to  {schema.eto_string(SqlSchema::bto)};
+   const string table_boundsgalactic{schema.eto_string(SqlSchema::boundsgal)};
+
+//--- assert LON LAT  from < to
+   string sql{
+      "SELECT header_id," + lon_from + "," + lon_to + "," + lat_from + "," + lat_to
+         + " FROM "+ table_boundsgalactic
+         + " WHERE ("
+         + lon_from + " > " + lon_to
+         + ") or ("
+         + lat_from + " > " + lat_to + ")"};
+
+//---- assert LON LAT are normalized
+   string sql2{
+      "SELECT header_id," + lon_from + "," + lon_to + "," + lat_from + "," + lat_to
+         + " FROM "+ table_boundsgalactic
+         + " WHERE ("
+         + lon_from + " > " + lon_to
+         + ") or ("
+           "(" + lon_from + " <  0) and ((" + lon_from + " < -180) or (" + lon_to + " >= 180 ))"
+           ") or ("
+           "(" + lon_from + " >= 0) and ((" + lon_from + " >= 360) or (" + lon_to + " >= 360))"
+           ") or ("
+         + lat_from + " > " + lat_to
+         + ") or ("
+           "(" + lat_from + " < -90 ) or (" + lat_from + " >= 90) or "
+           "(" + lat_to   + " < -90 ) or (" + lat_to   + " >= 90)"
+           ")"};
+//----
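+   // note: only sql2 is executed below; it subsumes the simpler from<to check built into sql above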
+
+
+   pqxx::result pqxx_result{db.dbExecQuery3(sql2)};
+
+   try
+   {
+      for (auto row: pqxx_result)
+      {
+         // FIXME take col-names from SqlSchema_SELECT_bounds
+         bounds.header_id = row["header_id"].as<unsigned int>();
+         bounds.lon_from  = row[lon_from].as<double>(default_when_null);
+         bounds.lon_to    = row[lon_to].as<double>(default_when_null);
+         bounds.lat_from  = row[lat_from].as<double>(default_when_null);
+         bounds.lat_to    = row[lat_to].as<double>(default_when_null);
+
+         returned_bounds.push_back(bounds);
+      }
+   }
+   catch (std::exception const &e) 
+   {
+      LOG_STREAM << "ERR " + string(__FILE__) + "." + to_string(__LINE__) + ": " << e.what() << endl;
+   }
+
+   return returned_bounds;
+}
+
+
+// external
+
+//----------------------------------------------------------------------//
+// init db:
+// all existing tables will be deleted
+// create surveys table and load its values from config::SurveysFile
+// create other tables empty
+//----------------------------------------------------------------------//
+void database::dbInit(const string db_uri, const string db_schema, const string surveys_csv_pathname)
+{
+   LOG_trace(__func__);
+
+   DbConn db(db_uri, db_schema);
+
+   Survey tabSurveys("surveys");
+
+   SqlSchema_CREATE cmdCreate;
+
+   vector<string> createAndLoadSurveys = { 
+      tabSurveys.getCREATE(),
+      tabSurveys.getINSERT(surveys_csv_pathname)
+   };
+
+   vector<string> createHeadersDb(cmdCreate.getCREATE());
+
+   db.dbExecCmds(createAndLoadSurveys);
+   db.dbExecCmds(createHeadersDb);
+};
+
+
+
+//----------------------------------------------------------------------//
+// check - verify consistency of database 
+// RULE1: each table must contain the same number of rows
+//----------------------------------------------------------------------//
+vector<string> database::dbCheck(const string db_uri, const string db_schema)
+{
+   DbConn db(db_uri, db_schema);
+
+   vector<string> result_strings;
+
+   // check 1: cube/image bounds in right order: from <= to
+
+/*  vector<db_bounds> reversed_bounds = bounds_reversed(db);
+
+   if(reversed_bounds.size() > 0) result_strings.push_back("assert from < to  failed in these rows:");
+   for(db_bounds one_row : reversed_bounds)
+   {
+      result_strings.push_back(
+            to_string(one_row.header_id)
+            + ": (" + to_string(one_row.lon_from)
+            + ", " + to_string(one_row.lon_to)
+            + ")   (" + to_string(one_row.lat_from)
+            + ", " + to_string(one_row.lat_to)
+            + ")"
+            );
+   }
+*/
+   // check 2: rows count equal in all tables
+
+   vector<string> tables = db.queryTableNames();
+
+   string qRowsCount = "SELECT ";
+   for(const string &table : tables)
+      qRowsCount += "(SELECT count(*) FROM " + table + ") as " + table + ",";
+   qRowsCount.erase(qRowsCount.size()-1);// remove last comma
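+   // e.g. for tables {surveys,obscore}: SELECT (SELECT count(*) FROM surveys) as surveys,(SELECT count(*) FROM obscore) as obscore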
+
+   vector<string> table_sizes{db.dbExecQuery2(qRowsCount)};
+
+   result_strings.insert(
+         result_strings.end(),
+         std::make_move_iterator(table_sizes.begin()),
+         std::make_move_iterator(table_sizes.end())
+         );
+
+   return result_strings;
+};
+
+
+
+
+
+
+
+
+
+//----------------------------------------------------------------------//
+// query Surveys Table for defined surveys
+// and list how many entries (FITS-files) each Survey has in the system
+// (Headers, ObsCore, VerticesGalactic,... tables)
+// Side-effect: check that number of entries in different tables is equal, otherwise
+// tables are inconsistent; DB is in error state
+//----------------------------------------------------------------------//
+vector<string> database::dbListSurveys(const string db_uri, const string db_schema)
+{
+   LOG_trace(__func__);
+
+   DbConn db(db_uri, db_schema);
+
+   string qSidNameCount(
+         "SELECT survey_id,count(headers.survey_id), name,species,transition " 
+         "FROM headers "
+         // only common rows         "INNER JOIN surveys USING (survey_id) "
+         "RIGHT OUTER JOIN surveys USING (survey_id) " // all rows of surveys table
+         "GROUP BY survey_id,surveys.name,surveys.species,surveys.transition "
+         "ORDER BY surveys.survey_id");
+
+   vector<string> res_str;// = db.dbExecQuery2(qSidNameCount);
+   //pqxx::result = res = db.dbExecQuery3(qSidNameCount);
+   return res_str;
+};
+
+
+
+
+void database::dbSurveyBounds(/*int sid,*/ const string db_uri, const string db_schema)
+{
+   LOG_trace(__func__);
+
+   DbConn db(db_uri, db_schema);
+
+   // FIXME re-implement with obscore::s_region or polygon_region(_galactic)
+   //
+   //db.dbExecQuery(Sql_SELECT_SurveyBounds::qLonMinMax(sid));
+   //db.dbExecQuery(Sql_SELECT_SurveyBounds::qLatMinMax(sid));
+}
+
+
+
+
+string database::dbListFiles(int sid, const string db_uri, const string db_schema, const string fitsdir)
+{
+   LOG_trace(__func__);
+
+   DbConn db(db_uri, db_schema);
+
+   Survey surv = db.querySurveyAttributes(sid);
+   string storageFilter = surv.getStorageFilter();
+
+   std::vector<string> pathnames =
+      fitsfiles::globVector(fitsdir + "/" + storageFilter);
+
+//   LOG_STREAM << "SID[" << setw(3) <<sid << "]: " << setw(5) <<pathnames.size() << "    "
+//      << surv.survName << " " << surv.survSpecies << " " << surv.survTransition << endl;
+
+   return "SID[" + to_string(sid) + "]: " + to_string(pathnames.size()) + "    "
+      + surv.survName + " | " + surv.survSpecies + " | " + surv.survTransition;
+}
+
+
+
+
+// add survey
+// - get pattern from Surveys table
+// - list files by pattern
+// - for each file's HDU calc table-data
+// - add table-data as one row to each table
+void database::dbAddSurvey(int sid, const string groups,
+      const string obscore_publisher,
+      const string obscore_access_format,
+      const string obscore_access_url,
+      const string db_uri, const string db_schema, const string fitsdir, int max_hdupos)
+{
+   LOG_trace(__func__);
+
+   DbConn db(db_uri, db_schema); 
+
+   Survey surv = db.querySurveyAttributes(sid);
+   string storageFilter = surv.getStorageFilter();
+
+   std::vector<string> pathnames =
+      fitsfiles::globVector(fitsdir + "/" + storageFilter);
+
+   // if there are no files for sid-survey
+   if(pathnames.size() == 0)
+   {
+      LOG_STREAM << "Found " << pathnames.size() << " files in " << fitsdir + "/" + storageFilter << endl;
+      return;
+   }
+
+   // key-values for obscore-table
+   const fitsfiles::keys_by_type in_keys{
+         ObsCoreKeys::add_str_keys(ObsCoreKeys::strKeys, std::set<std::string>{surv.fitskeyFacilityName, surv.fitskeyInstrumentName}),
+         ObsCoreKeys::uintKeys,
+         ObsCoreKeys::doubleKeys};
+
+   // read all filtered pathnames and create 'INSERT INTO headers ...' SQL-statement
+
+   SqlSchema_INSERT cmdInsert;
+
+   //int max_hid = db.queryMaxHid();
+   //LOG_STREAM << "max_hid: " << max_hid << endl;
+
+   for(unsigned int i=0; i < pathnames.size(); i++)
+   {
+      string pathname = pathnames.at(i);
+      uintmax_t filesize(fitsfiles::fileSize(pathname));
+
+      string c_pathname = pathname;
+      string filename = basename((char*)c_pathname.c_str());
+
+      try
+      {
+
+         // read all HDU's
+
+         std::vector<fitsfiles::Hdu> allHdus = fitsfiles::fname2hdrstr(
+               pathname, max_hdupos,
+               &in_keys);
+
+         for(unsigned int j=0; j<allHdus.size(); j++)
+         {
+            fitsfiles::Hdu hdu = allHdus.at(j);
+
+            cmdInsert.appendRow(
+                  //0,//++max_hid,
+                  //sid,
+                  obscore_publisher,
+                  obscore_access_format,
+                  obscore_access_url,
+                  surv, groups, hdu, filename, filesize);
+         }
+      }
+      catch (std::exception const &e)
+      {
+         LOG_STREAM << "ERR " + string(__func__) + ": " << pathname << " : "<< e.what() << endl;
+      }
+   }
+
+   db.dbExecCmds(cmdInsert.getINSERT());
+
+}
+
+
+void database::dbModifyGroups(int sid, const string groups,
+      const string obscore_publisher,
+      const string db_uri, const string db_schema)
+{
+   LOG_trace(__func__);
+
+   DbConn db(db_uri, db_schema); 
+
+   vector<string> cmdModGroups{
+      "UPDATE obscore SET groups = '{" + groups + "}' WHERE (policy = 'PRIV') AND (obs_publisher_did IN (SELECT CONCAT('" + (obscore_publisher + "?") + "',pubdid) FROM headers WHERE survey_id =" + to_string(sid) + "))"};
+
+   db.dbExecCmds(cmdModGroups);
+}
+
+
+
+// FIXME hid identifies rows to remove in tables -> no good, pubdid should be the identifier
+void database::dbRemoveSurvey(int sid, const string db_uri, const string db_schema)
+{
+   LOG_trace(__func__);
+
+   DbConn db(db_uri, db_schema);
+
+   Survey surv = db.querySurveyAttributes(sid);
+
+   SqlSchema_DELETE cmdDelete;
+
+   vector<string> deleteRowsAllTables = {cmdDelete.getCommand(sid, surv)};
+
+   db.dbExecCmds(deleteRowsAllTables);
+}
+
+
+
+
diff --git a/data-access/engine/src/vlkb-obscore/src/database/dataset_id.cpp b/data-access/engine/src/vlkb-obscore/src/database/dataset_id.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..58d477e55d49d7b42aaa941731d0e21ba4265fed
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/database/dataset_id.cpp
@@ -0,0 +1,59 @@
+
+#include "dataset_id.hpp"
+
+#include "io.hpp"
+#include "my_assert.hpp"
+
+#include <string>
+
+#include <limits.h> // PATH_MAX
+#include <stdio.h>
+
+using namespace std;
+
+
+
+
+
+
+#define JOINCHAR_HDUNUM (2+2)
+#define PUBDID_LEN (PATH_MAX + NAME_MAX + JOINCHAR_HDUNUM)
+char * create_pubdid (const char * path, const char * filename,
+            unsigned int hdunum, char * pubdid)
+{
+   // FIXME snprintf remove
+   snprintf(pubdid,PUBDID_LEN,"%s_%s_%u",path, filename,hdunum);
+
+   // remove_slash in pubdid string
+   char *p = pubdid;
+   while(*p !='\0')
+   {   
+      if(*p == '/') *p = '-';
+      p++;
+   }   
+
+   return pubdid;
+}
+
+
+
+
+string dataset_id::create(const string path, const string filename, unsigned int hdunum, bool scramble)
+{
+   string pubdid_str;
+   if(scramble)
+   {
+      char pubdid[PUBDID_LEN];
+      create_pubdid (path.c_str(), filename.c_str(), hdunum, pubdid);
+      pubdid_str = string{pubdid};
+   }
+   else
+   {
+      pubdid_str = path + "/" + filename;// FIXME + "[" + to_string(hdunum-1) + "]";
+   }
+   return pubdid_str;
+}
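+// Illustrative results for path "surveyA/dr1", filename "cube.fits", hdunum 1:
+//   scramble=true  -> "surveyA-dr1_cube.fits_1" (slashes replaced by '-')
+//   scramble=false -> "surveyA/dr1/cube.fits"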
+
+
+
+
diff --git a/data-access/engine/src/vlkb-obscore/src/database/dataset_id.hpp b/data-access/engine/src/vlkb-obscore/src/database/dataset_id.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..8b45c8168fa4d33e694992425756950206572c84
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/database/dataset_id.hpp
@@ -0,0 +1,17 @@
+#ifndef DATASET_ID_HPP
+#define DATASET_ID_HPP
+
+#include <string>
+
+// dataset identifier (see: 4.1 Dataset Identifiers)
+// ref: [IVOA Identifiers, Version 2.0, IVOA Recommendation 2016-05-23 ]
+
+namespace dataset_id
+{
+
+std::string create(const std::string path, const std::string filename, unsigned int hdunum, bool scramble);
+
+}
+
+#endif
+
diff --git a/data-access/engine/src/vlkb-obscore/src/main.cpp b/data-access/engine/src/vlkb-obscore/src/main.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..6bfd5c05ff87cdb27d962e637eece3d9e46c809e
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/main.cpp
@@ -0,0 +1,421 @@
+
+#include "database.hpp"
+#include "fitsfiles.hpp"
+#include "config.hpp"
+#include "io.hpp"
+
+#include <iostream>
+#include <sstream>
+
+#include <assert.h>
+
+
+std::ostream& OUT_STREAM = std::cout;
+std::ostream& ERROR_STREAM = std::cerr;
+
+
+using namespace std;
+
+namespace vlkb
+{
+   const int EXIT_WITH_USAGE = -1000;
+   const int EXIT_WITH_ERROR = -2000;
+
+   config conf;
+
+   void usage(const std::string progname)
+   {
+      std::cout 
+         << "Usage: " << progname << " <conf_filename> <command> [cmd-options] [cmd-args]" << endl
+         << "\n where conf_filename is typically /etc/" << progname << "/datasets.conf\n "
+         << "\n and where commands are:\n "
+         << "\n\t dbfiles dbinit dbcheck dblist dbbounds dbadd dbmodgroups dbremove\n"
+         << std::endl
+         << "Version: " << VERSIONSTR << " " << BUILD << std::endl;
+   }
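+   // illustrative invocation: vlkb-obscore /etc/vlkb-obscore/datasets.conf dblist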
+
+
+
+   // program recognizes following commands
+
+   enum cmd_set {dbinit, dbcheck, dblist, dbfiles, dbbounds, dbaddsurvey, dbmodgroups, dbremovesurvey};
+
+
+   // from bash receive args as strings
+
+   cmd_set to_cmd(const std::string cmdstr)
+   {
+      cmd_set cmd;
+
+      if(cmdstr.compare("dbinit") == 0)       cmd = dbinit;
+      else if(cmdstr.compare("dbcheck") == 0)      cmd = dbcheck;
+      else if(cmdstr.compare("dblist") == 0)       cmd = dblist;
+      else if(cmdstr.compare("dbfiles") == 0)      cmd = dbfiles;
+      else if(cmdstr.compare("dbbounds") == 0)     cmd = dbbounds;
+      else if(cmdstr.compare("dbadd") == 0)        cmd = dbaddsurvey;
+      else if(cmdstr.compare("dbmodgroups") == 0)  cmd = dbmodgroups;
+      else if(cmdstr.compare("dbremove") == 0)     cmd = dbremovesurvey;
+      else
+      {
+         stringstream cmd_err;
+         cmd_err << "String \'" << cmdstr<< "\' not recognized as valid command." << endl;
+         throw(invalid_argument(cmd_err.str()));
+      }
+
+      return cmd;
+   }
+
+} // namespace vlkb
+
+
+const bool WITH_PASSWORD = true;
+
+std::string base_name(std::string path)
+{
+   return path.substr(path.find_last_of('/') + 1);
+   // FIXME replace with basename
+}
+
+
+
+//-----------------------------------------------------------
+// all commands handle [argc,argv] and return a code:
+// 0 - Ok: command produces result as described in 'usage'
+// non-zero: Warning, Error: response has other form that 
+// the one described in 'usage'
+//-----------------------------------------------------------
+
+
+int cmd_dbInit(int argc, char * argv[])
+{
+   int rc;
+   try
+   {
+      if(argc!=2)
+      {
+         std::cout << "Usage: dbinit <surveys.csv>" << endl;
+         return vlkb::EXIT_WITH_USAGE;
+      }
+
+      const string csv_filename{argv[1]};
+
+      OUT_STREAM << "DB: " << vlkb::conf.getDbUri() << " schema["<< vlkb::conf.getDbSchema() << "]" << endl;
+      OUT_STREAM << "This will delete all data in database, confirm with 'yes':" << endl;
+      string answer;
+      std::cin >> answer;
+      if(answer.compare("yes") == 0) 
+      {
+         database::dbInit(vlkb::conf.getDbUri(WITH_PASSWORD), vlkb::conf.getDbSchema(), csv_filename);
+         rc = 0;
+      }
+      else
+      {
+         rc = vlkb::EXIT_WITH_ERROR;
+      }
+   }
+   catch(exception& e)
+   {
+      rc = vlkb::EXIT_WITH_ERROR;
+      ERROR_STREAM << e.what() << endl;
+   }
+   return rc;
+}
+
+
+
+int cmd_dbCheck(void)
+{
+   int rc;
+   try
+   {
+      vector<string> db_rows{database::dbCheck(vlkb::conf.getDbUri(WITH_PASSWORD), vlkb::conf.getDbSchema())};
+      for(string one_row : db_rows) cout << one_row << endl;
+      rc = 0;
+   }
+   catch(exception& e)
+   {
+      rc = vlkb::EXIT_WITH_ERROR;
+      ERROR_STREAM << e.what() << endl;
+   }
+   return rc;
+}
+
+
+
+int cmd_dbList(void)
+{
+   int rc;
+   try
+   {
+      vector<string> res_str = database::dbListSurveys(vlkb::conf.getDbUri(WITH_PASSWORD), vlkb::conf.getDbSchema());
+      rc = 0;
+      for(string res_row : res_str)
+         cout << res_row << endl;
+   }
+   catch(exception& e)
+   {
+      rc = vlkb::EXIT_WITH_ERROR;
+      ERROR_STREAM << e.what() << endl;
+   }
+   return rc;
+}
+
+
+
+int cmd_dbBounds(int argc, char * argv[])
+{
+   int rc;
+   switch(argc)
+   {
+      case 2:
+         try
+         {
+            database::dbSurveyBounds(/*stoi(argv[1]),*/ vlkb::conf.getDbUri(WITH_PASSWORD), vlkb::conf.getDbSchema());
+            rc = 0;
+         }
+         catch(exception& e)
+         {
+            rc = vlkb::EXIT_WITH_ERROR;
+            ERROR_STREAM << e.what() << endl;
+         }
+         break;
+
+      default:
+         cout << "Usage: dbbounds <SID>" << endl
+            << "prints min-max bounds" << endl;
+         rc = vlkb::EXIT_WITH_USAGE;
+   }
+   return rc;
+}
+
+
+
+int cmd_dbFiles(int argc, char * argv[])
+{
+   int rc;
+   switch(argc)
+   {
+      case 2:
+         try
+         {
+            string row = database::dbListFiles(std::stoi(argv[1]), vlkb::conf.getDbUri(WITH_PASSWORD), vlkb::conf.getDbSchema(), vlkb::conf.getFitsDir());
+            rc = 0;
+            cout << row << endl;
+         }
+         catch(exception& e)
+         {
+            rc = vlkb::EXIT_WITH_ERROR;
+            ERROR_STREAM << e.what() << endl;
+         }
+         break;
+
+      case 3:
+         try
+         {
+            for(int i=std::stoi(argv[1]); i<=std::stoi(argv[2]); i++)
+               cout << 
+                  database::dbListFiles(i, vlkb::conf.getDbUri(WITH_PASSWORD), vlkb::conf.getDbSchema(), vlkb::conf.getFitsDir())
+                  << endl;
+            rc = 0;
+         }
+         catch(exception& e)
+         {
+            rc = vlkb::EXIT_WITH_ERROR;
+            ERROR_STREAM << e.what() << endl;
+         }
+         break;
+
+      default:
+         cout << "Usage: dbfiles <SID-from> [SID-to]" << endl
+            << "prints file-count for given survey(s)" << endl;
+         rc = vlkb::EXIT_WITH_USAGE;
+   }
+   return rc;
+}
+
+
+
+int cmd_dbAdd(int argc, char * argv[])
+{
+   const int max_hdupos = 1; // FIXME add as param to the command
+   const string groups = ""; // NOTE: uses separate cmd 'dbmodgroups' to add groups to enable access to PRIVATE surveys
+   int sid_from, sid_to;
+   int rc = 0;
+
+   switch(argc)
+   {
+      case 2:
+         sid_from = std::stoi(argv[1]);
+         sid_to   = sid_from;
+         break;
+
+      case 3:
+         sid_from = std::stoi(argv[1]);
+         sid_to   = std::stoi(argv[2]);
+         break;
+
+      default:
+         cout << "Usage: dbadd <SID-from> [SID-to]" << endl
+            << "inserts survey(s) metadata to database" << endl;
+         return vlkb::EXIT_WITH_USAGE; // bail out: sid_from/sid_to are not set
+   }
+
+   OUT_STREAM << "This will add all data into database, (even if already exists!), confirm with 'yes':" << endl;
+   string answer;
+   std::cin >> answer;
+   if(answer.compare("yes") != 0) return vlkb::EXIT_WITH_ERROR; 
+
+   for(int i=sid_from; i<=sid_to; i++)
+   {
+      try
+      {
+         database::dbAddSurvey(i, groups,
+               vlkb::conf.getObsCorePublisher(), vlkb::conf.getObsCoreAccessFormat(), vlkb::conf.getObscoreAccessUrl(),
+               vlkb::conf.getDbUri(WITH_PASSWORD), vlkb::conf.getDbSchema(), vlkb::conf.getFitsDir(), max_hdupos);
+      }
+      catch(exception& e)
+      {
+         rc = vlkb::EXIT_WITH_ERROR;
+         ERROR_STREAM << "SID["<< i << "]: " << e.what() << endl;
+      }
+   }
+
+   return rc;
+}
+
+
+
+int cmd_dbModGroups(int argc, char * argv[])
+{
+   // FIXME groups should be a cmd arg: mandatory if policy is PRIVATE, empty/null if policy is PUBLIC
+   // e.g. exit with error if dbAddSurvey gets sid of PRIVATE but no groups arg supplied
+   string groups;
+   int sid_from, sid_to;
+   int rc;
+   switch(argc)
+   {
+      case 3:
+         groups   = std::string(argv[1]);
+         sid_from = std::stoi(argv[2]);
+         sid_to   = sid_from;
+         break;
+
+      case 4:
+         groups   = std::string(argv[1]);
+         sid_from = std::stoi(argv[2]);
+         sid_to   = std::stoi(argv[3]);
+         break;
+
+      default:
+         cout << "Usage: dbmodgroups <groups> <SID-from> [SID-to]" << endl
+            << "modifies list of groups which can access PRIVATE surveys" << endl;
+         return vlkb::EXIT_WITH_USAGE; // bail out: groups/sid range are not set
+   }
+
+   try
+   {
+      for(int i=sid_from; i<=sid_to; i++)
+         database::dbModifyGroups(i, groups,
+               vlkb::conf.getObsCorePublisher(), vlkb::conf.getDbUri(WITH_PASSWORD), vlkb::conf.getDbSchema());
+      rc = 0;
+   }
+   catch(exception& e)
+   {
+      rc = vlkb::EXIT_WITH_ERROR;
+      ERROR_STREAM << e.what() << endl;
+   }
+
+   return rc;
+}
+
+
+
+int cmd_dbRemove(int argc, char * argv[])
+{
+   int rc;
+   switch(argc)
+   {
+      case 2:
+         try
+         {
+            database::dbRemoveSurvey(stoi(argv[1]), vlkb::conf.getDbUri(WITH_PASSWORD), vlkb::conf.getDbSchema());
+            rc = 0;
+         }
+         catch(exception& e)
+         {
+            rc = vlkb::EXIT_WITH_ERROR;
+            ERROR_STREAM << e.what() << endl;
+         }
+         break;
+
+      default:
+         cout << "Usage: dbremove <SID>" << endl
+            << "removes survey metadata from database" << endl;
+         rc = vlkb::EXIT_WITH_USAGE;
+   }
+   return rc;
+}
+
+
+
+//-----------------------------------------------------------
+// main
+//-----------------------------------------------------------
+int main (int argc, char * argv[])
+{
+   const std::string progname = base_name(argv[0]);
+
+   if( argc < 3 )
+   {
+      vlkb::usage(progname);
+      return vlkb::EXIT_WITH_USAGE;
+   }
+
+   const string conf_filename(argv[1]);
+
+   // recognize command from string
+   const vlkb::cmd_set cmd(vlkb::to_cmd(argv[2]));
+
+   // sub-command arguments (skips: vlkb <conf-filename>)
+   int cmd_argc = argc - 2;
+   char ** cmd_argv = &(argv[2]);
+
+   vlkb::conf.read_config(conf_filename);
+/*
+   cout << "LOG DIR: " << vlkb::conf.getLogDir() << endl;
+   cout << "FITSDIR: " << vlkb::conf.getFitsDir() << endl;
+   cout << "DB[" << vlkb::conf.getDbSchema() << "]: " <<vlkb::conf.getDbUri() << endl;
+   cout << endl; 
+*/
+   LOG_open(vlkb::conf.getLogDir(), vlkb::conf.getLogFileName());
+   int rc = vlkb::EXIT_WITH_ERROR; // default, in case a command throws before setting rc
+
+   try
+   {
+      switch(cmd)
+      {
+         case vlkb::dbinit:         rc = cmd_dbInit(cmd_argc,cmd_argv); break;
+         case vlkb::dbcheck:        rc = cmd_dbCheck(); break;
+         case vlkb::dblist:         rc = cmd_dbList(); break;
+         case vlkb::dbbounds:       rc = cmd_dbBounds(cmd_argc,cmd_argv); break;
+         case vlkb::dbfiles:        rc = cmd_dbFiles(cmd_argc,cmd_argv); break;
+         case vlkb::dbaddsurvey:    rc = cmd_dbAdd(cmd_argc,cmd_argv); break;
+         case vlkb::dbmodgroups:    rc = cmd_dbModGroups(cmd_argc,cmd_argv); break;
+         case vlkb::dbremovesurvey: rc = cmd_dbRemove(cmd_argc,cmd_argv); break;
+
+         default: assert(false);
+      }
+   }
+   catch(const invalid_argument& ex)
+   {
+      cout << ex.what() << endl;
+   }
+   catch(const exception& ex)
+   {
+      cerr <<  ex.what() << endl;
+   }
+
+   LOG_close();
+   return rc;
+}
+
diff --git a/data-access/engine/src/vlkb-obscore/src/parse_surveys_csv.cpp b/data-access/engine/src/vlkb-obscore/src/parse_surveys_csv.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..0b3725176d82941e79caca2792062e5586b51006
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/parse_surveys_csv.cpp
@@ -0,0 +1,231 @@
+
+#include "parse_surveys_csv.hpp"
+
+#include "my_assert.hpp"
+#include "io.hpp"
+
+#include <aria-csv-parser/parser.hpp>
+
+#include <limits> // uint max needed
+#include <stdexcept>
+#include <iomanip>
+#include <iostream>
+#include <string>
+#include <vector>
+
+using namespace aria::csv;
+using namespace std;
+
+
+string trim(string& str)
+{
+   // \v(=11) vertical-tab, \f(=12) formfeed
+   // NOTE: current terminal drivers do not honour these but simply translate them to chars; some printer drivers might
+   const string spaces{" \t\n\r\v\f"};
+   size_t first = str.find_first_not_of(spaces);
+   if (first == std::string::npos)
+      return "";
+   size_t last = str.find_last_not_of(spaces);
+   return str.substr(first, (last-first+1));
+}
+
+
+string to_string(dataproduct d)
+{
+   string str;
+   switch(d)
+   {
+      case dataproduct::IMAGE : str = "image"; break;
+      case dataproduct::CUBE  : str = "cube";  break;
+   }
+   my_assert(!str.empty(), __FILE__,__LINE__, "unrecognized value of dataproduct type");
+   return str;
+}
+
+
+dataproduct to_dataproduct(string raw_string)
+{
+   if(trim(raw_string).compare("image") == 0) return dataproduct::IMAGE;
+   else if(trim(raw_string).compare("cube")  == 0) return dataproduct::CUBE;
+   else throw invalid_argument("'dataproduct' type must be 'image' or 'cube' but was: >" + raw_string + "<"); 
+}
+
+
+
+string to_string(authorization_policy p)
+{
+   string str;
+   switch(p)
+   {
+      case authorization_policy::PRIVATE : str = "PRIVATE"; break;
+      case authorization_policy::PUBLIC  : str = "PUBLIC";  break;
+   }
+   my_assert(!str.empty(), __FILE__,__LINE__, "unrecognized value of authorization_policy type");
+   return str;
+}
+
+authorization_policy to_authorization_policy(string raw_string)
+{
+   if(trim(raw_string).compare("PRIVATE") == 0) return authorization_policy::PRIVATE;
+   else if(trim(raw_string).compare("PUBLIC")  == 0) return authorization_policy::PUBLIC;
+   else throw invalid_argument("'authorization_policy' type must be 'PRIVATE' or 'PUBLIC' but was: " + raw_string);
+}
+
+
+
+std::string to_string(velocity_unit vu)
+{
+   string str;
+   switch(vu)
+   {
+      case velocity_unit::MPS  : str = "m/s";  break;
+      case velocity_unit::KMPS : str = "km/s"; break;
+      case velocity_unit::NONE : str = "''";   break;
+      case velocity_unit::KPC :  str = "kpc";  break;
+   }
+   my_assert(!str.empty(), __FILE__,__LINE__, "unrecognized value of velocity_unit type");
+   return str;
+}
+
+velocity_unit to_velocity_unit(std::string raw_string)
+{
+   if(trim(raw_string).compare("m.s**-1") == 0) return velocity_unit::MPS;
+   else if(trim(raw_string).compare("m/s") == 0) return velocity_unit::MPS;
+   else if(trim(raw_string).compare("km.s**-1") == 0) return velocity_unit::KMPS;
+   else if(trim(raw_string).compare("km/s") == 0) return velocity_unit::KMPS;
+   else if(trim(raw_string).empty()) return velocity_unit::NONE;
+   else if(trim(raw_string).compare("kpc") == 0) return velocity_unit::KPC;
+   else throw invalid_argument("'velocity_unit' type must be 'm.s**-1' or 'km.s**-1' but was: " + raw_string); 
+}
+
+
+std::string to_string(calibration calib_level)
+{
+   string str;
+   switch(calib_level)
+   {
+      case calibration::LEVEL0: str = "LEVEL0"; break;
+      case calibration::LEVEL1: str = "LEVEL1"; break;
+      case calibration::LEVEL2: str = "LEVEL2"; break;
+      case calibration::LEVEL3: str = "LEVEL3"; break;
+      case calibration::LEVEL4: str = "LEVEL4"; break;
+   }
+   my_assert(!str.empty(), __FILE__,__LINE__, "unrecognized value of calibration type in " + string{__func__});
+   return str;
+}
+
+unsigned int to_uint(calibration calib_level)
+{
+   const unsigned int default_value = std::numeric_limits<unsigned int>::max();
+   unsigned int val = default_value;
+   switch(calib_level)
+   {
+      case calibration::LEVEL0: val = 0; break;
+      case calibration::LEVEL1: val = 1; break;
+      case calibration::LEVEL2: val = 2; break;
+      case calibration::LEVEL3: val = 3; break;
+      case calibration::LEVEL4: val = 4; break;
+   }
+   my_assert(val != default_value, __FILE__,__LINE__, "unrecognized value of calibration type in " + string{__func__});
+   return val;
+}
+
+
+calibration to_calibration(std::string raw_string)
+{
+   size_t pos{};
+   unsigned int level = stoi(raw_string, &pos);
+
+   switch(level)
+   {
+      case 0: return calibration::LEVEL0; break;
+      case 1: return calibration::LEVEL1; break;
+      case 2: return calibration::LEVEL2; break;
+      case 3: return calibration::LEVEL3; break;
+      case 4: return calibration::LEVEL4; break;
+      default:
+              throw invalid_argument("'calibration' value is " + to_string(level) + " however valid values are 0,1,2,3 and 4");
+   }
+}
+
+
+
+
+
+std::ostream& operator<<( std::ostream &out, struct survey const& p)
+{
+   out << setw(3) << right << p.survey_id 
+      << ": " << setw(5) << to_string(p.dataproduct_type)
+      << " " << setw(7) << to_string(p.auth_policy)
+      << setw(13) << right << p.rest_frequency_Hz;
+   out << " " << setw(50) << left << p.storage_path + "/" + p.file_filter; 
+   out << " [" << p.name << ", " << p.species << ", " << p.transition << "]" 
+      << "   " << p.description
+      << "   " << p.o_ucd;	       
+
+   return out;
+}
+
+
+string to_stringN(string raw_string, string::size_type n)
+{
+   if(raw_string.length() > n) throw invalid_argument(raw_string + " is too long; maximum length is " + to_string(n));
+   else return raw_string;
+}
+
+
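+/* Expected CSV column order (one survey per row; a header row is skipped when
+ * skip_first_row is true). Illustrative example row:
+ * 1,SurveyA,CO,1-0,115271202000,,km/s,surveys/surveyA,*.fits,Example survey,cube,2,phot.flux,TELESCOP,INSTRUME,PUBLIC
+ */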
+vector<survey> parse_surveys(string filename, bool skip_first_row)
+{
+   LOG_trace(__func__);
+
+   std::ifstream f(filename);
+   CsvParser parser(f);
+
+   vector<survey> surveys;
+
+   struct survey surv;
+
+   int ixr = 0;
+   for (auto& row : parser)
+   {
+      if(skip_first_row && (ixr++ == 0)) continue; // FIXME do before entering the loop
+
+      int ixf = 0;
+      for (auto& field : row)
+      {
+         switch(ixf++)
+         {
+            case 0: surv.survey_id = atoi(field.c_str()); break;
+                    // subsurvey id
+            case 1: surv.name       = field; break;
+            case 2: surv.species    = field; break;
+            case 3: surv.transition = field; break;
+                    // vlkb-internal extra-data
+            case 4: surv.rest_frequency_Hz = stoul(field); break;
+            case 5: break; // NOT USED surv.restf_fits_unit     = to_stringN(field, 20); break;
+            case 6: surv.velocity_fits_unit  = to_velocity_unit(field); break;
+                    // vlkb-internal did-resolution
+            case 7: surv.storage_path  = field; break;
+            case 8: surv.file_filter   = field; break;
+                    //
+            case 9: surv.description = field; break;
+                    // obscore related
+            case 10: surv.dataproduct_type         = to_dataproduct(field); break;
+            case 11: surv.calib_level              = to_calibration(field); break;
+            case 12: surv.o_ucd                    = field; break;
+            case 13: surv.fitskey_facility_name    = to_stringN(field, 8); break;
+            case 14: surv.fitskey_instrument_name  = to_stringN(field, 8); break;
+                     // security
+            case 15: surv.auth_policy = to_authorization_policy(field); break;
+            default:
+                     my_assert(false, __FILE__,__LINE__, "too many fields in a row of surveys metadata file: " + filename);
+         }
+      }
+      surveys.push_back(surv);
+   }
+
+   return surveys;
+}
+
+
+
diff --git a/data-access/engine/src/vlkb-obscore/src/parse_surveys_csv.hpp b/data-access/engine/src/vlkb-obscore/src/parse_surveys_csv.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..c03fec65cc5ab0f32b92e90e76ba2123884d5541
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/src/parse_surveys_csv.hpp
@@ -0,0 +1,49 @@
+#ifndef PARSE_SURVEYS_CSV_HPP
+#define PARSE_SURVEYS_CSV_HPP
+
+#include <string>
+#include <vector>
+
+enum class dataproduct {IMAGE, CUBE};
+std::string to_string(dataproduct d);
+dataproduct to_dataproduct(std::string raw_string);
+
+enum class authorization_policy {PRIVATE, PUBLIC};
+std::string to_string(authorization_policy p);
+authorization_policy to_authorization_policy(std::string raw_string);
+
+enum class velocity_unit {MPS, KMPS, KPC, NONE};
+std::string to_string(velocity_unit);
+velocity_unit to_velocity_unit(std::string raw_string);
+
+enum class calibration {LEVEL0, LEVEL1, LEVEL2, LEVEL3, LEVEL4};
+std::string to_string(calibration calib_level);
+unsigned int to_uint(calibration calib_level);
+calibration to_calibration(std::string raw_string);
+
+
+struct survey
+{
+   unsigned int survey_id;
+   std::string name;
+   std::string species;
+   std::string transition;
+   unsigned long rest_frequency_Hz;
+//   std::string restf_fits_unit;    // NOT used
+   velocity_unit velocity_fits_unit;
+   std::string storage_path;
+   std::string file_filter;
+   std::string description;   // convert to UTF-8
+   dataproduct dataproduct_type;
+   calibration calib_level;
+   std::string o_ucd;
+   std::string fitskey_facility_name;   // length < 8
+   std::string fitskey_instrument_name; // length < 8
+   authorization_policy auth_policy;
+};
+
+std::ostream& operator<<( std::ostream &out, struct survey const& p);
+
+std::vector<survey> parse_surveys(std::string filename, bool skip_first_row);
+
+#endif
diff --git a/data-access/engine/src/vlkb-obscore/vlkb-obscore.1 b/data-access/engine/src/vlkb-obscore/vlkb-obscore.1
new file mode 100644
index 0000000000000000000000000000000000000000..3ed9ccc97f73fe9c6f8c424d0b2fdcca941a1118
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/vlkb-obscore.1
@@ -0,0 +1,23 @@
+.\"                                      Hey, EMACS: -*- nroff -*-
+.\" (C) Copyright 2023 ...
+.\"
+.TH vlkb-obscore 1 
+.SH NAME
+vlkb-obscore \- vlkb-obscore application
+.SH SYNOPSIS
+.B vlkb-obscore 
+.SH DESCRIPTION
+The 
+.B vlkb-obscore 
+application generates an ObsCore table following the Virtual Observatory recommendation (http://ivoa.net/documents). It requires
+a sub-directory with FITS files and access to a database that allows creating and updating tables. The program prints detailed
+help on its sub-commands.
+.SH SEE ALSO
+.BR vlkb (1).
+.SH AUTHORS
+The
+.B vlkb-obscore 
+application was written by 
+RBu <rbu@ia2.inaf.it>
+.PP
+This document was written by RBu <rbu@ia2.inaf.it> for Debian.
diff --git a/data-access/engine/src/vlkb-obscore/vlkb-obscore.changelog.Debian b/data-access/engine/src/vlkb-obscore/vlkb-obscore.changelog.Debian
new file mode 100644
index 0000000000000000000000000000000000000000..502981f01efb819d3e9cf50a86283653fb0ba167
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/vlkb-obscore.changelog.Debian
@@ -0,0 +1,8 @@
+vlkb-obscore (1.4.8) stable; urgency=low
+
+  [ VLKB ]
+  * First release via deb and rpm packages.
+
+ -- INAF <ia2@inaf.com>  Sat, 23 Dec 2023 11:30:00 +0100
+
+
diff --git a/data-access/engine/src/vlkb-obscore/vlkb-obscore.control b/data-access/engine/src/vlkb-obscore/vlkb-obscore.control
new file mode 100644
index 0000000000000000000000000000000000000000..4841bd80e8c1dbc53238a137ee3dff53ed7229e7
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/vlkb-obscore.control
@@ -0,0 +1,8 @@
+Package: vlkb-obscore
+Version:
+Section: utils
+Priority: optional
+Architecture: all
+Maintainer: VLKB <ia2@vlkb.org>
+Description: Command line tool to create a VO ObsCore table in a database from FITS files in a given directory.
+
diff --git a/data-access/engine/src/vlkb-obscore/vlkb-obscore.copyright b/data-access/engine/src/vlkb-obscore/vlkb-obscore.copyright
new file mode 100644
index 0000000000000000000000000000000000000000..0cd64beebc70e69d1eb6f9d6ddc69b8f063999ac
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/vlkb-obscore.copyright
@@ -0,0 +1,14 @@
+vlkb-obscore
+
+Copyright: 2023 INAF <ia2@inaf.com>
+
+2023-10-30
+
+The entire code base may be distributed under the terms of the GNU General
+Public License (GPL).  Alternatively, all of the source code, as well as
+any code derived from that code, may instead be distributed under the
+GNU Lesser General Public License (LGPL), at the choice of the distributor.
+The complete license texts are available at the locations referenced below.
+
+See /usr/share/common-licenses/(GPL|LGPL)
diff --git a/data-access/engine/src/vlkb-obscore/vlkb-obscore.datasets.conf b/data-access/engine/src/vlkb-obscore/vlkb-obscore.datasets.conf
new file mode 100644
index 0000000000000000000000000000000000000000..9d710d89728767706ccea944d9e5e1a06d6dd953
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/vlkb-obscore.datasets.conf
@@ -0,0 +1,19 @@
+
+# root of path for local access
+# fits_path_surveys=/srv/surveys
+
+# DB connection (see PostgreSQL manual, 'Connection URIs')
+pg_uri=
+pg_schema=
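+# e.g. (illustrative values only):
+#   pg_uri=postgresql://vlkb:secret@localhost:5432/vlkb
+#   pg_schema=datasets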
+
+# obs_publisher_did = <obscore publisher> ? <generated-pubdid>
+# obscore_publisher=
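+# e.g. (illustrative): obscore_publisher=ivo://example.org/vlkb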
+
+# obs_access_url: <obscore_access_url>/<storage-path>/<file-name>
+obscore_access_url=
+# obscore_access_format=application/fits
+
+# logging (the log holds the last execution only)
+# log_dir=/tmp
+# log_filename=vlkb-obscore.log
+
diff --git a/data-access/engine/src/vlkb-obscore/vlkb-obscore.spec b/data-access/engine/src/vlkb-obscore/vlkb-obscore.spec
new file mode 100644
index 0000000000000000000000000000000000000000..de2f7fa7a79b2a4443fd02e4819b33eb39414cd8
--- /dev/null
+++ b/data-access/engine/src/vlkb-obscore/vlkb-obscore.spec
@@ -0,0 +1,33 @@
+Name: vlkb-obscore
+Version: %{version}
+Release: 1%{?dist}
+Summary: vlkb-obscore
+Source1: vlkb-obscore
+License: GPLv3+
+URL: http://ia2.inaf.it
+BuildRequires: gcc >= 3.2.0, glibc-devel >= 2.17, libstdc++-devel >= 4.8, ast-devel >= 7.3.4, libpqxx-devel >= 4.0, cfitsio-devel >= 3.370, libcsv-devel >= 3.0, librabbitmq-devel >= 0.8.0
+Requires: glibc >= 2.17, libstdc++ >= 4.8, ast >= 7.3.4, libpqxx >= 4.0, cfitsio >= 3.370, libcsv >= 3.0, librabbitmq >= 0.8.0 
+
+%description
+Utility from the ViaLactea Knowledge Base project to ingest a VO ObsCore table into a DB from a set of FITS files provided in a given directory.
+
+
+%prep
+
+%build
+
+
+%install
+mkdir -p %{buildroot}%{_prefix}/bin
+install -m 755 %{SOURCE1} %{buildroot}%{_prefix}/bin
+%files
+%{_bindir}/vlkb-obscore
+
+
+%post
+
+%postun
+
+
+%changelog
+                
diff --git a/data-access/engine/src/vlkb/Makefile b/data-access/engine/src/vlkb/Makefile
new file mode 100644
index 0000000000000000000000000000000000000000..5b954a816bff911ecf3ca83efd8ad9e84ba274c9
--- /dev/null
+++ b/data-access/engine/src/vlkb/Makefile
@@ -0,0 +1,177 @@
+#================================================================================
+EXEC_NAME=vlkb
+INST_NAME=test
+DEBUG_LEV=-v1
+INSTALL_DIR=/usr/local
+VERSION ?= $(shell git describe)
+TAR_NAME := `basename $(PWD)`
+#================================================================================
+DEPS_DIR := ../common ../../ext/aria-csv ../../ext/nlohmann-json
+DEPS_INC := $(foreach d, $(DEPS_DIR), $d/include)
+DEPS_LIB := $(foreach d, $(DEPS_DIR), $d/lib)
+#================================================================================
+INC_DIR = $(DEPS_INC) \
+	  /usr/include/cfitsio \
+	  /usr/local/cfitsio/include
+LIB_DIR = $(DEPS_LIB) \
+	  /usr/lib64/ast \
+	  /usr/local/lib \
+	  /usr/local/cfitsio/lib
+#================================================================================
+CC=g++
+CFLAGS_DEBUG   = -g -DFDB_DEBUG
+CFLAGS_RELEASE = -O2 
+FLAGS_COMMON   = -fPIC -Wall -Wextra -Wconversion -fno-common -pthread -DVERSIONSTR='"$(VERSION)"' \
+-DBUILD='"$(shell LANG=us_US date; hostname)"'
+CFLAGS_COMMON   = -c -Wstrict-prototypes $(FLAGS_COMMON)
+CXX_DEBUG_FLAGS   = -g -DVERBOSE_DEBUG -DFDB_DEBUG
+CXX_RELEASE_FLAGS = -O3
+CXX_DEFAULT_FLAGS = -c -std=c++11 $(FLAGS_COMMON)
+LDFLAGS = -Wall -lvlkbcommon -lcfitsio -lcsv -last -last_grf_2.0 -last_grf_3.2 -last_grf_5.6 -last_grf3d -last_err -lstdc++ -lm  
+INC_PARM=$(foreach d, $(INC_DIR), -I$d)
+LIB_PARM=$(foreach d, $(LIB_DIR), -L$d)
+#================================================================================
+EXT_DIR = ext
+SRC_DIR = src
+OBJ_DIR = obj
+BIN_DIR = bin
+INC_PARM += -I$(EXT_DIR)/include -I$(SRC_DIR)
+LIB_PARM += -L$(EXT_DIR)/lib
+#================================================================================
+EXECUTABLE = $(BIN_DIR)/$(EXEC_NAME)
+CPP_FILES  = $(wildcard $(SRC_DIR)/*.cpp)
+OBJ_FILES  = $(addprefix $(OBJ_DIR)/,$(notdir $(CPP_FILES:.cpp=.o))) 
+#================================================================================
+NPROCS = $(shell grep -c 'processor' /proc/cpuinfo)
+MAKEFLAGS += -j$(NPROCS)
+#================================================================================
+.PHONY: all debug release clean
+
+all: debug 
+
+.PHONY: run 
+run: debug
+	$(EXECUTABLE) $(INST_NAME) $(DEBUG_LEV)
+
+release: CFLAGS   += $(CFLAGS_RELEASE) $(CFLAGS_COMMON)
+release: CXXFLAGS += $(CXX_RELEASE_FLAGS) $(CXX_DEFAULT_FLAGS)
+release: $(EXECUTABLE)
+
+debug: CFLAGS   += $(CFLAGS_DEBUG) $(CFLAGS_COMMON)
+debug: CXXFLAGS += $(CXX_DEBUG_FLAGS) $(CXX_DEFAULT_FLAGS)
+debug: $(EXECUTABLE)
+
+$(EXECUTABLE): makedir $(OBJ_FILES) $(DB_OBJ_FILES)
+	$(CC) $(OBJ_FILES) $(LIB_PARM) $(LDFLAGS) -o $@
+
+$(OBJ_DIR)/%.o: $(SRC_DIR)/%.cpp
+	$(CC) $(CXXFLAGS) $(INC_PARM) -o $@ $<
+
+clean:
+	-rm -rf $(OBJ_DIR) $(BIN_DIR) $(EXT_DIR)
+
+.PHONY: echo
+echo:
+	@echo EXECUTABLE:
+	@echo $(EXECUTABLE)
+	@echo CPP FILES:
+	@echo $(CPP_FILES)
+	@echo OBJ_FILES:
+	@echo $(OBJ_FILES)
+	@echo DB_OBJ_FILES:
+	@echo $(DB_OBJ_FILES)
+	@echo INC_PARM
+	@echo $(INC_PARM)
+	@echo LIB_PARM
+	@echo $(LIB_PARM)
+	@echo installedEXE
+	@echo $(INSTALL_DIR)/$(EXEC_NAME)$(SUFFIX)
+
+
+gdb:
+	gdb $(BIN_DIR)/$(EXEC_NAME) -x vlkb.gdb
+
+# release tar.gz
+
+.PHONY: $(DEPS_DIR)
+$(DEPS_DIR):
+	$(MAKE) -C $@ $(DEPS_TARGET)
+
+.PHONY: deps
+deps : $(DEPS_DIR)
+	mkdir -p $(EXT_DIR)
+	cp -r $(DEPS_INC) $(EXT_DIR)
+	cp -r $(DEPS_LIB) $(EXT_DIR)
+
+.PHONY: deps-clean
+deps-clean : DEPS_TARGET=clean
+deps-clean : $(DEPS_DIR)
+
+.PHONY: makedir
+makedir:
+	-mkdir -p $(OBJ_DIR) $(OBJ_DIR)/database $(BIN_DIR)
+
+.PHONY: tar
+tar: deps
+	-tar -czvf $(TAR_NAME)-$(VERSION).tar.gz --transform="s|^|$(TAR_NAME)-$(VERSION)/|" $(PROTO_DIR) $(SRC_DIR) $(EXT_DIR) Makefile
+
+
+
+# release rpm deb
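+# assumed flow: 'make rpm' builds the release binary itself via the 'release' target;
+# 'make deb' packs an existing bin/vlkb, so run 'make release' first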
+
+.PHONY: rpm
+rpm: RPM_ROOT=rpmbuild
+rpm: release
+	mkdir -p $(RPM_ROOT)/{BUILD,BUILDROOT,RPMS,SOURCES,SPECS,SRPMS}
+	cp vlkb.spec $(RPM_ROOT)/SPECS
+	cp bin/vlkb $(RPM_ROOT)/SOURCES
+	rpmbuild -bb --define "_topdir `pwd`/$(RPM_ROOT)"  --define "_prefix /usr/local" --define "version $(shell git describe | sed -r 's/-/./g')"  vlkb.spec
+	find $(RPM_ROOT)/RPMS/* -name '*.rpm' -print0 | xargs -0 cp -t .
+	rm -fr $(RPM_ROOT)
+
+
+.PHONY: deb
+deb: DEB_ROOT=debbuild
+deb: PREFIX=$(DEB_ROOT)/vlkb/usr/local
+deb:
+	mkdir -p $(DEB_ROOT)/vlkb/DEBIAN $(PREFIX)
+	mkdir -p $(PREFIX)/bin $(PREFIX)/etc/vlkb
+	mkdir -p $(PREFIX)/share/doc/vlkb
+	mkdir -p $(PREFIX)/share/man/man1
+	sed 's/Version:.*/Version: $(VERSION)/' vlkb.control > $(DEB_ROOT)/vlkb/DEBIAN/control
+	echo "/usr/local/etc/vlkb/datasets.conf" > $(DEB_ROOT)/vlkb/DEBIAN/conffiles
+	cp bin/vlkb $(PREFIX)/bin
+	cp vlkb.datasets.conf $(PREFIX)/etc/vlkb/datasets.conf
+	cp vlkb.changelog.Debian $(PREFIX)/share/doc/vlkb/changelog.Debian
+	cp vlkb.copyright $(PREFIX)/share/doc/vlkb/copyright
+	cp vlkb.1 $(PREFIX)/share/man/man1/vlkb.1
+	gzip --best -n $(PREFIX)/share/man/man1/vlkb.1
+	gzip --best -n $(PREFIX)/share/doc/vlkb/changelog.Debian
+	cd $(DEB_ROOT) && dpkg-deb --root-owner-group --build vlkb && mv vlkb.deb ../vlkb_$(VERSION).deb && cd -
+	rm -fr $(DEB_ROOT)
+
+
+# gitlab Packages doc: https://docs.gitlab.com/ee/user/packages/generic_packages/
+# make up/download PACK_EXT = rpm | deb
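+# e.g. make deb && make upload PACK_EXT=deb  (PACK_FILE resolves to the newest vlkb*.deb)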
+.PHONY: upload
+upload: PACK_FILE := $(shell ls -t $(EXEC_NAME)*.$(PACK_EXT) | head -1)
+upload: GITLAB_PROJ_ID := 79
+upload: GITLAB_PROJ_NAME := $(shell basename -s .git `git config --get remote.origin.url`)
+upload: VER_MAJOR := $(shell echo $(VERSION) | cut -f1 -d.)
+upload: VER_MINOR := $(shell echo $(VERSION) | cut -f2 -d.)
+upload: PACK_URL := "https://ict.inaf.it/gitlab/api/v4/projects/$(GITLAB_PROJ_ID)/packages/generic/$(GITLAB_PROJ_NAME)/$(VER_MAJOR).$(VER_MINOR)/$(PACK_FILE)"
+upload:
+	curl --header "PRIVATE-TOKEN: glpat-CJZDcks7bYqE__ePn4J6" --upload-file $(PACK_FILE) $(PACK_URL)
+
+
+.PHONY: download
+#download: PACK_FILE := $(EXEC_NAME)-$(shell echo $(VERSION) | sed -r "s/-/./g ")-1.x86_64.rpm
+download: PACK_FILE := $(EXEC_NAME)_$(VERSION).deb
+download: GITLAB_PROJ_ID := 79
+download: GITLAB_PROJ_NAME := $(shell basename -s .git `git config --get remote.origin.url`)
+download: VER_MAJOR := $(shell echo $(VERSION) | cut -f1 -d.)
+download: VER_MINOR := $(shell echo $(VERSION) | cut -f2 -d.)
+download: PACK_URL := "https://ict.inaf.it/gitlab/api/v4/projects/$(GITLAB_PROJ_ID)/packages/generic/$(GITLAB_PROJ_NAME)/$(VER_MAJOR).$(VER_MINOR)/$(PACK_FILE)"
+download:
+	curl -O --header "PRIVATE-TOKEN: glpat-CJZDcks7bYqE__ePn4J6" $(PACK_URL)
+
diff --git a/data-access/engine/src/vlkb/src/ast.cpp b/data-access/engine/src/vlkb/src/ast.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..f13650f587392571c8e48818c1fbec37ae1f648d
--- /dev/null
+++ b/data-access/engine/src/vlkb/src/ast.cpp
@@ -0,0 +1,145 @@
+
+#include "ast.hpp"
+
+#include "ast4vl.hpp" // cout operator needed
+#include "cutout.hpp" // coordinates needed
+#include "cutout_ostream.hpp"
+#include "service_string.hpp"
+#include "fitsfiles.hpp" // header-string needed
+#include "json_request.hpp" // uses json for vlkb-overlap region-string
+
+#include "io.hpp"
+#include "my_assert.hpp"
+
+#include <iostream>
+#include <climits> // INT_MAX needed
+#include <sstream>
+#include <vector>
+
+
+using namespace std;
+
+
+
+
+//---------------------------------------------------------------------
+// vertices
+//---------------------------------------------------------------------
+
+int vlkb_skyvertices(const string& pathname, const string& skysys_str)
+{
+   LOG_trace(__func__);
+
+   int maxHdu = 1; // FIXME INT_MAX; // read all HDU's
+
+   std::vector<fitsfiles::Hdu> allHdus = 
+      fitsfiles::fname2hdrstr(pathname, maxHdu);
+
+   for(unsigned int i=0; i<allHdus.size(); i++)
+   {
+      cout << "HDU#" << i << endl;
+
+      fitsfiles::Hdu hd = allHdus.at(i);
+
+      vector<point2d> vertices = calc_skyvertices(hd.m_header, skysys_str);
+
+      for(point2d vertex : vertices) cout << " " << vertex << endl;
+   }
+
+   return 0;
+}
+
+
+
+//---------------------------------------------------------------------
+// bounds
+//---------------------------------------------------------------------
+/*
+const string VELOLSRK{"System=VELO,StdOfRest=LSRK,Unit=km/s"};
+const string WAVEBARY{"System=WAVE,StdOfRest=Bary,Unit=m"};
+*/
+int vlkb_listbounds(const string& skysys_str, const string& specsys_str, const string& pathname)
+{
+   LOG_trace(__func__);
+
+   int maxHdu = 1;//FIXME INT_MAX; // read all HDU's
+
+   std::vector<fitsfiles::Hdu> allHdus = 
+      fitsfiles::fname2hdrstr(pathname, maxHdu);
+
+   for(unsigned int i=0; i<allHdus.size(); i++)
+   {
+      cout << "HDU#" << i << endl;
+
+      fitsfiles::Hdu hd = allHdus.at(i);
+
+      vector<Bounds> bounds_vec = calc_bounds(hd.m_header, skysys_str, specsys_str);
+
+      for(Bounds bnds : bounds_vec) cout << bnds << endl;
+   }
+
+   return 0;
+}
+
+
+
+
+
+
+
+
+
+//---------------------------------------------------------------------
+// overlap with area given in query-string form (name=value&...)
+//---------------------------------------------------------------------
+
+// parse query string to service::coordinates
+
+vector<string> split (const string &s, char delim)
+{
+   vector<string> result;
+   stringstream ss (s);
+   string item;
+
+   while (getline (ss, item, delim)) {
+      result.push_back (item);
+   }
+
+   return result;
+}
+
+
+coordinates parse_coordinates(const string region_string)
+{
+   LOG_trace(string(__func__) + " : " + region_string);
+
+   json_request req(region_string);
+   coordinates coord = to_coordinates(req.get_pos(), req.get_band(), req.get_time(), req.get_pol());
+
+   LOG_STREAM << "coord parsed: " << coord << endl;
+
+   return coord;
+}
+
+
+int vlkb_overlap(const string& pathname, const string& region, vector<uint_bounds>& bnds) 
+{
+   LOG_trace(__func__);
+
+   int maxHdu = 1;// FIXME INT_MAX: fitsfiles::header throws an error reading past end-of-file when INT_MAX is used
+
+   std::vector<fitsfiles::Hdu> allHdus = 
+      fitsfiles::fname2hdrstr(pathname, maxHdu);
+
+   const coordinates coord = parse_coordinates(region.c_str());
+
+   int ov_code = 0; // 0 = overlap check could not be performed
+   for(unsigned int i=0; i<allHdus.size(); i++)
+   {
+      fitsfiles::Hdu hd = allHdus.at(i);
+      bnds = calc_overlap(hd.m_header, coord, ov_code);
+   }
+   return ov_code;
+}
+
+
diff --git a/data-access/engine/src/vlkb/src/ast.hpp b/data-access/engine/src/vlkb/src/ast.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..0ba4b902613b2e894f5c3a345791de4bee913091
--- /dev/null
+++ b/data-access/engine/src/vlkb/src/ast.hpp
@@ -0,0 +1,13 @@
+#ifndef AST_HPP
+#define AST_HPP
+
+#include "ast4vl.hpp" // uint_bounds needed
+
+#include <string>
+#include <vector>
+
+int vlkb_skyvertices(const std::string& pathname, const std::string& skysys_str);
+int vlkb_listbounds(const std::string& skysys_str, const std::string& specsys_str, const std::string& pathname);
+int vlkb_overlap(const std::string& pathname, const std::string& region, std::vector<uint_bounds>& bnds);
+
+#endif
diff --git a/data-access/engine/src/vlkb/src/dropdegen.cpp b/data-access/engine/src/vlkb/src/dropdegen.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..8eb1e2ecb7ab3c5173b287449421cf7147db7038
--- /dev/null
+++ b/data-access/engine/src/vlkb/src/dropdegen.cpp
@@ -0,0 +1,153 @@
+
+#include "dropdegen.hpp"
+
+#include "fitsfiles.hpp"
+#include "io.hpp"
+
+#include <fitsio.h>
+
+#include <string.h>
+#include <stdio.h>
+
+using namespace std;
+
+void dropdegen(fitsfile *fptr)
+{
+    LOG_trace(__func__);
+
+    FILE* lferr = stderr;
+
+    int status = 0;
+
+    // DBG print key statistics Before eventual modifications
+    int keysexist = 0;
+    int morekeys  = 0;
+    if(fits_get_hdrspace(fptr,&keysexist,&morekeys,&status))
+        fits_report_error(lferr, status);
+//    else
+        // FIXME fprintf(lferr,"BEFORE keysexist:%d morekeys:%d\n",keysexist,morekeys);
+
+    int naxis = 0;
+    if(fits_read_key(fptr,TINT,"NAXIS",&naxis,NULL,&status))
+        fits_report_error(lferr, status);
+//    else
+        //fprintf(lfout, "%s: NAXIS  %d\n",__func__,naxis);
+
+    // Note: string lengths:
+    // char keyname[FLEN_KEYWORD], colname[FLEN_VALUE], coltype[FLEN_VALUE];
+
+    LOG_STREAM << "NAXIS: " << to_string(naxis) << endl;
+
+    int i;
+    int orignaxis = naxis;
+    for (i=0; (i<orignaxis) && (!status); i++) {
+
+        int axislen = 0;
+        char key[FLEN_KEYWORD] = {"\0"};
+        int kix = i+1; // keyindex (starts from 1)
+        sprintf(key,"NAXIS%d",kix);
+        // FIXME fprintf(lfout,"%s: start for %s\n",__func__,key);
+        if(fits_read_key(fptr,TINT,key,&axislen,NULL,&status))
+            fits_report_error(lferr, status);
+//        else
+            // FIXME fprintf(lfout, "%s: %s %d\n",__func__,key,axislen);
+
+        if(axislen != 1)
+            continue;
+
+        // found degen axis NAXISi = 1 -> remove it
+        // FIXME fprintf(lfout,"%s: degen axis %s\n",__func__,key);
+
+        // adjust NAXIS
+        int newvalue = --naxis;
+        // FIXME fprintf(lfout,"%s: new value NAXIS %d\n",__func__,newvalue);
+        if(fits_update_key(fptr,TINT,"NAXIS",&newvalue,NULL,&status))
+            fits_report_error(lferr, status);
+
+        // delete NAXISi ...
+        // FIXME fprintf(lfout,"%s: delete key %s\n",__func__,key);
+        if(fits_delete_key(fptr,key,&status))
+            fits_report_error(lferr, status);
+
+        // ... and all keys which end with kix and are 5 chars: CTYPE CRVAL CRPIX ...
+        // FIXME is this correct? Other keys longer than 5 chars and
+        // alternative encodings (one letter after the axis number)??
+        // How to define which keys to remove?
+        char keys[FLEN_KEYWORD] = {"\0"};
+        sprintf(keys,"?????%d",kix);
+        // FIXME fprintf(lfout,"%s: delete keys %s ",__func__,keys);
+        while(fits_delete_key(fptr,keys,&status) != KEY_NO_EXIST){
+            ;// FIXME fprintf(lfout,".");
+        }
+        // FIXME fprintf(lfout,"\n");
+        if(status==KEY_NO_EXIST){
+            status = 0; // Reset after expected error in while()
+        } else {
+            fits_report_error(lferr, status);
+        }
+    }
+
+    // DBG print key statistics After eventual modifications
+    if(fits_get_hdrspace(fptr,&keysexist,&morekeys,&status))
+        fits_report_error(lferr, status);
+//    else
+        // FIXME fprintf(lferr,"AFTER  keysexist:%d morekeys:%d\n",keysexist,morekeys);
+}
+
+
+
+
+/*
+ * Filename is <somename>.fits -> try to read it as fits file.
+ */
+int vlkb_dropdegen(const char * fitsfname)
+{
+	LOG_trace(__func__);
+
+	fitsfile *fptr;
+	int status=0;
+	int rc=0;
+	int hdupos;
+
+	int iomode = READWRITE;
+
+	LOG_STREAM << "fits_open_file" << endl;
+
+	if (fits_open_file(&fptr, fitsfname, iomode, &status))
+	{
+		LOG_STREAM << "fitsfname: " << fitsfname << endl;
+		LOG_STREAM << fitsfiles::cfitsio_errmsg(__FILE__,__LINE__,status) << endl;
+		rc = 0;
+		goto f_end;
+	}
+	fits_get_hdu_num(fptr, &hdupos);  /* Get the current HDU position */
+
+	LOG_STREAM << "hdupos: " << to_string(hdupos) << endl;
+
+	for (; !status; hdupos++)  /* Main loop through each HDU/extension */
+	{
+		LOG_STREAM << "hdupos: " << to_string(hdupos) << endl;
+
+		// drop degenerate axis in current HDU
+		dropdegen(fptr);
+
+		// take next HDU
+		fits_movrel_hdu(fptr, 1, NULL, &status);  /* try to move to next HDU */
+	}
+
+	if (status == END_OF_FILE){
+		status = 0; // Reset after expected error in for
+	} else {
+		rc = -1;
+	}
+
+	fits_close_file(fptr, &status);
+
+f_end:
+	//printf("%s: rc = %d\n",__func__,rc);
+	return rc;
+}
+
+
+
+
diff --git a/data-access/engine/src/vlkb/src/dropdegen.hpp b/data-access/engine/src/vlkb/src/dropdegen.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..4b5162006b6a6a443101ee8e6536fb30749db96b
--- /dev/null
+++ b/data-access/engine/src/vlkb/src/dropdegen.hpp
@@ -0,0 +1,19 @@
+#ifndef DROPDEGEN_HPP
+#define DROPDEGEN_HPP
+
+//#include <fitsio.h>
+
+//#include <stdio.h>
+
+// Drop all degenerate axes from the header of a FITS file.
+// A degenerate axis is one with NAXISi = 1.
+// dropdegen() will adjust the NAXIS keyword, remove the NAXISi keyword(s)
+// and remove all keywords related to the degenerate axis 'i'.
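+//
+// Illustrative (assumed) example: a header with NAXIS=3 and NAXIS3=1 becomes
+// NAXIS=2; NAXIS3 and the 5-character keywords indexed by 3 (CTYPE3, CRVAL3,
+// CRPIX3, ...) are deleted.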
+
+// drop degen in an opened file fptr
+//void dropdegen(fitsfile *fptr);
+
+// drop degen in all HDU of a fits file given by fitsname
+int vlkb_dropdegen(const char * fitsfname);
+
+#endif
diff --git a/data-access/engine/src/vlkb/src/imcopy.cpp b/data-access/engine/src/vlkb/src/imcopy.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..51ce0157300e60d94cb70c6beee287d8dbd32253
--- /dev/null
+++ b/data-access/engine/src/vlkb/src/imcopy.cpp
@@ -0,0 +1,462 @@
+#include <algorithm> // remove_if
+#include <string.h> 
+#include <stdlib.h>
+#include <math.h> 
+#include <ctype.h>
+#include <errno.h>
+#include <stddef.h>  /* apparently needed to define size_t */
+#include <stdexcept>
+#include "fitsio2.h"
+//#include "group.h"
+
+#include "service_string.hpp" // to_cfitsio_format() needed
+#include "ast.hpp"
+#include "fitsfiles.hpp"
+#include "imcopy.hpp"
+#include "io.hpp"
+
+#include "fitsio.h"
+
+using namespace std;
+
+
+
+int stream_cutout(string pathname, int extnum, string region)
+{
+   vector<uint_bounds> bnds;
+
+   int ov_code = vlkb_overlap(pathname, region, bnds);
+
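+   // ov_code follows AST astOverlap() semantics (see overlapmsg[] in main.cpp):
+   // 2..5 = the regions overlap in some way, 1 and 6 = no overlap, 0 = check failed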
+   if((ov_code >= 2) && (ov_code <= 5))
+   {
+      string pixfilter{ to_cfitsio_format(bnds) };
+
+      int rc = imcopy(pathname, extnum, pixfilter, "dummy");
+      if(rc)
+         std::cout << "rc = " << rc << std::endl;
+      return EXIT_SUCCESS;
+
+   }
+   else if((ov_code == 1) || (ov_code == 6))
+   {
+      // no overlap
+      return EXIT_SUCCESS;
+   }
+   else
+   {
+      throw runtime_error("overlap code invalid: " + to_string(ov_code));
+   }
+}
+
+
+
+// ----------------------------------------------------------------------------------
+
+int fits_copy_image_section2(
+      fitsfile *fptr,  /* I - pointer to input image */
+      fitsfile *newptr,  /* I - pointer to output image */
+      char *expr,       /* I - Image section expression    */
+      int *status);
+
+
+
+
+int imcopy(std::string filename, int extnum, std::string pixfilter, std::string temp_root)
+{
+   LOG_trace(__func__);
+
+   LOG_STREAM << filename << " EXT: " << extnum << endl;
+   LOG_STREAM << "pixfilter: " << pixfilter << endl;
+   LOG_STREAM << "temp_root: " << temp_root << endl;
+
+   int status = 0;
+
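+   // strip the optional brackets from the section expression,
+   // e.g. "[1:100,1:100,1:100]" -> "1:100,1:100,1:100"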
+   pixfilter.erase( remove(pixfilter.begin(), pixfilter.end(), '['), pixfilter.end() );
+   pixfilter.erase( remove(pixfilter.begin(), pixfilter.end(), ']'), pixfilter.end() );
+
+   LOG_STREAM << "filter expr: " << pixfilter << endl;
+
+   char *expr = (char*)pixfilter.c_str(); /* I - Image section expression */
+
+   fitsfile *fptr;    /* I - pointer to input image */
+   fitsfile *newfptr; /* I - pointer to output image */
+
+   const char * cfilename = filename.c_str();
+
+   fits_open_diskfile(&fptr, cfilename, READONLY, &status);
+   if (status)
+   {
+      string errmsg{ fitsfiles::cfitsio_errmsg(__FILE__, __LINE__, status) };
+      throw runtime_error("fits_open_file " + filename + " failed: " + errmsg);
+   }
+
+   fits_create_file(&newfptr, "stream://", &status);
+   if (status)
+   {
+      string errmsg{ fitsfiles::cfitsio_errmsg(__FILE__, __LINE__, status) };
+      throw runtime_error("fits_open_file to stream failed: " + errmsg);
+   }
+
+   int rc = fits_copy_image_section2(fptr, newfptr, expr, &status);
+   if (status)
+   {
+      string errmsg{fitsfiles::cfitsio_errmsg(__FILE__, __LINE__, status)};
+      throw runtime_error("fits_copy_image_section " + filename
+            + " to cut-file with " + string{expr} + " failed: " + errmsg);
+   }
+
+   fits_close_file(fptr, &status);
+   if (status)
+   {
+      string errmsg{fitsfiles::cfitsio_errmsg(__FILE__, __LINE__, status)};
+      throw runtime_error("fits_close_file " + filename + " failed: " + errmsg);
+   }
+
+   fits_close_file(newfptr, &status);
+   if (status)
+   {
+      string errmsg{fitsfiles::cfitsio_errmsg(__FILE__, __LINE__, status)};
+      throw runtime_error("fits_close_file cut failed: " + errmsg);
+   }
+
+   return rc;
+}
+
+
+/* ---- version 3.49 ------- */
+/*
+ * Currently this is a direct copy from the cfitsio library. It will be modified in two ways:
+ * - allow a buffer bigger than one row (use a sub-cube up to dimension k with volume[1..k] < LIMIT)
+ * - duplicate the buffer and run the reader on a parallel thread so that reader and writer do not wait for each other
+ */
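+/*
+ * expr is a cfitsio image-section expression without the brackets: one
+ * min:max[:inc] range per axis, e.g. (illustrative) "1:100,1:100,5:5".
+ */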
+int fits_copy_image_section2(
+      fitsfile *fptr,  /* I - pointer to input image */
+      fitsfile *newptr,  /* I - pointer to output image */
+      char *expr,       /* I - Image section expression    */
+      int *status)
+{
+   /*
+      copies an image section from the input file to a new output HDU
+      */
+
+   int bitpix, naxis, numkeys, nkey;
+   long naxes[] = {1,1,1,1,1,1,1,1,1}, smin, smax, sinc;
+   long fpixels[] = {1,1,1,1,1,1,1,1,1};
+   long lpixels[] = {1,1,1,1,1,1,1,1,1};
+   long incs[] = {1,1,1,1,1,1,1,1,1};
+   char *cptr, keyname[FLEN_KEYWORD], card[FLEN_CARD];
+   int ii, tstatus, anynull;
+   long minrow, maxrow, minslice, maxslice, mincube, maxcube;
+   long firstpix;
+   long ncubeiter, nsliceiter, nrowiter, kiter, jiter, iiter;
+   int klen, kk, jj;
+   long outnaxes[9], outsize, buffsize;
+   double *buffer, crpix, cdelt;
+
+   if (*status > 0)
+      return(*status);
+
+   /* get the size of the input image */
+   fits_get_img_type(fptr, &bitpix, status);
+   fits_get_img_dim(fptr, &naxis, status);
+   if (fits_get_img_size(fptr, naxis, naxes, status) > 0)
+      return(*status);
+
+   if (naxis < 1 || naxis > 4)
+   {
+      ffpmsg(
+            "Input image either had NAXIS = 0 (NULL image) or has > 4 dimensions");
+      return(*status = BAD_NAXIS);
+   }
+
+   /* create output image with same size and type as the input image */
+   /*  Will update the size later */
+   fits_create_img(newptr, bitpix, naxis, naxes, status);
+
+   /* copy all other non-structural keywords from the input to output file */
+   fits_get_hdrspace(fptr, &numkeys, NULL, status);
+
+   for (nkey = 4; nkey <= numkeys; nkey++) /* skip the first few keywords */
+   {
+      fits_read_record(fptr, nkey, card, status);
+
+      if (fits_get_keyclass(card) > TYP_CMPRS_KEY)
+      {
+         /* write the record to the output file */
+         fits_write_record(newptr, card, status);
+      }
+   }
+
+   if (*status > 0)
+   {
+      ffpmsg("error copying header from input image to output image");
+      return(*status);
+   }
+
+   /* parse the section specifier to get min, max, and inc for each axis */
+   /* and the size of each output image axis */
+
+   cptr = expr;
+   for (ii=0; ii < naxis; ii++)
+   {
+      if (fits_get_section_range(&cptr, &smin, &smax, &sinc, status) > 0)
+      {
+         ffpmsg("error parsing the following image section specifier:");
+         ffpmsg(expr);
+         return(*status);
+      }
+
+      if (smax == 0)
+         smax = naxes[ii];   /* use whole axis  by default */
+      else if (smin == 0)
+         smin = naxes[ii];   /* use inverted whole axis */
+
+      if (smin > naxes[ii] || smax > naxes[ii])
+      {
+         ffpmsg("image section exceeds dimensions of input image:");
+         ffpmsg(expr);
+         return(*status = BAD_NAXIS);
+      }
+
+      fpixels[ii] = smin;
+      lpixels[ii] = smax;
+      incs[ii] = sinc;
+
+      if (smin <= smax)
+         outnaxes[ii] = (smax - smin + sinc) / sinc;
+      else
+         outnaxes[ii] = (smin - smax + sinc) / sinc;
+
+      /* modify the NAXISn keyword */
+      fits_make_keyn("NAXIS", ii + 1, keyname, status);
+      fits_modify_key_lng(newptr, keyname, outnaxes[ii], NULL, status);
+
+      /* modify the WCS keywords if necessary */
+
+      if (fpixels[ii] != 1 || incs[ii] != 1)
+      {
+         for (kk=-1;kk<26; kk++)  /* modify any alternate WCS keywords */
+         {
+            /* read the CRPIXn keyword if it exists in the input file */
+            fits_make_keyn("CRPIX", ii + 1, keyname, status);
+
+            if (kk != -1) {
+               klen = (int)strlen(keyname);
+               keyname[klen]= (char)((int)'A' + kk);
+               keyname[klen + 1] = '\0';
+            }
+
+            tstatus = 0;
+            if (fits_read_key(fptr, TDOUBLE, keyname,
+                     &crpix, NULL, &tstatus) == 0)
+            {
+               /* calculate the new CRPIXn value */
+               if (fpixels[ii] <= lpixels[ii]) {
+                  crpix = (crpix - double(fpixels[ii])) / double(incs[ii]) + 1.0;
+                  /*  crpix = (crpix - (fpixels[ii] - 1.0) - .5) / incs[ii] + 0.5; */
+               } else {
+                  crpix = (double(fpixels[ii]) - crpix)  / double(incs[ii]) + 1.0;
+                  /* crpix = (fpixels[ii] - (crpix - 1.0) - .5) / incs[ii] + 0.5; */
+               }
+
+               /* modify the value in the output file */
+               fits_modify_key_dbl(newptr, keyname, crpix, 15, NULL, status);
+
+               if (incs[ii] != 1 || fpixels[ii] > lpixels[ii])
+               {
+                  /* read the CDELTn keyword if it exists in the input file */
+                  fits_make_keyn("CDELT", ii + 1, keyname, status);
+
+                  if (kk != -1) {
+                     klen = (int)strlen(keyname);
+                     keyname[klen]=(char)((int)'A' + kk);
+                     keyname[klen + 1] = '\0';
+                  }
+
+                  tstatus = 0;
+                  if (fits_read_key(fptr, TDOUBLE, keyname,
+                           &cdelt, NULL, &tstatus) == 0)
+                  {
+                     /* calculate the new CDELTn value */
+                     if (fpixels[ii] <= lpixels[ii])
+                        cdelt = cdelt * double(incs[ii]);
+                     else
+                        cdelt = cdelt * double(-incs[ii]);
+
+                     /* modify the value in the output file */
+                     fits_modify_key_dbl(newptr, keyname, cdelt, 15, NULL, status);
+                  }
+
+                  /* modify the CDi_j keywords if they exist in the input file */
+
+                  fits_make_keyn("CD1_", ii + 1, keyname, status);
+
+                  if (kk != -1) {
+                     klen = (int)strlen(keyname);
+                     keyname[klen]=(char)((int)'A' + kk);
+                     keyname[klen + 1] = '\0';
+                  }
+
+                  for (jj=0; jj < 9; jj++)   /* look for up to 9 dimensions */
+                  {
+                     keyname[2] = (char)((int)'1' + jj);
+
+                     tstatus = 0;
+                     if (fits_read_key(fptr, TDOUBLE, keyname,
+                              &cdelt, NULL, &tstatus) == 0)
+                     {
+                        /* calculate the new CDi_j value */
+                        if (fpixels[ii] <= lpixels[ii])
+                           cdelt = cdelt * double(incs[ii]);
+                        else
+                           cdelt = cdelt * double(-incs[ii]);
+
+                        /* modify the value in the output file */
+                        fits_modify_key_dbl(newptr, keyname, cdelt, 15, NULL, status);
+                     }
+                  }
+
+               } /* end of if (incs[ii]... loop */
+            }   /* end of fits_read_key loop */
+         }    /* end of for (kk  loop */
+      }
+   }  /* end of main NAXIS loop */
+
+   if (ffrdef(newptr, status) > 0)  /* force the header to be scanned */
+   {
+      return(*status);
+   }
+
+   /* turn off any scaling of the pixel values */
+   fits_set_bscale(fptr,  1.0, 0.0, status);
+   fits_set_bscale(newptr, 1.0, 0.0, status);
+
+   /* to reduce memory foot print, just read/write image 1 row at a time */
+
+   outsize = outnaxes[0];
+   buffsize = (abs(bitpix) / 8) * outsize;
+
+   buffer = (double *) malloc(buffsize); /* allocate memory for the image row */
+   if (!buffer)
+   {
+      ffpmsg("fits_copy_image_section: no memory for image section");
+      return(*status = MEMORY_ALLOCATION);
+   }
+   /* read the image section then write it to the output file */
+
+   minrow = fpixels[1];
+   maxrow = lpixels[1];
+   if (minrow > maxrow) {
+      nrowiter = (minrow - maxrow + incs[1]) / incs[1];
+   } else {
+      nrowiter = (maxrow - minrow + incs[1]) / incs[1];
+   }
+
+   minslice = fpixels[2];
+   maxslice = lpixels[2];
+   if (minslice > maxslice) {
+      nsliceiter = (minslice - maxslice + incs[2]) / incs[2];
+   } else {
+      nsliceiter = (maxslice - minslice + incs[2]) / incs[2];
+   }
+
+   mincube = fpixels[3];
+   maxcube = lpixels[3];
+   if (mincube > maxcube) {
+      ncubeiter = (mincube - maxcube + incs[3]) / incs[3];
+   } else {
+      ncubeiter = (maxcube - mincube + incs[3]) / incs[3];
+   }
+
+   firstpix = 1;
+   for (kiter = 0; kiter < ncubeiter; kiter++)
+   {
+      if (mincube > maxcube) {
+         fpixels[3] = mincube - (kiter * incs[3]);
+      } else {
+         fpixels[3] = mincube + (kiter * incs[3]);
+      }
+
+      lpixels[3] = fpixels[3];
+
+      for (jiter = 0; jiter < nsliceiter; jiter++)
+      {
+         if (minslice > maxslice) {
+            fpixels[2] = minslice - (jiter * incs[2]);
+         } else {
+            fpixels[2] = minslice + (jiter * incs[2]);
+         }
+
+         lpixels[2] = fpixels[2];
+
+         for (iiter = 0; iiter < nrowiter; iiter++)
+         {
+            if (minrow > maxrow) {
+               fpixels[1] = minrow - (iiter * incs[1]);
+            } else {
+               fpixels[1] = minrow + (iiter * incs[1]);
+            }
+
+            lpixels[1] = fpixels[1];
+
+            if (bitpix == 8)
+            {
+               ffgsvb(fptr, 1, naxis, naxes, fpixels, lpixels, incs, 0,
+                     (unsigned char *) buffer, &anynull, status);
+
+               ffpprb(newptr, 1, firstpix, outsize, (unsigned char *) buffer, status);
+            }
+            else if (bitpix == 16)
+            {
+               ffgsvi(fptr, 1, naxis, naxes, fpixels, lpixels, incs, 0,
+                     (short *) buffer, &anynull, status);
+
+               ffppri(newptr, 1, firstpix, outsize, (short *) buffer, status);
+            }
+            else if (bitpix == 32)
+            {
+               ffgsvk(fptr, 1, naxis, naxes, fpixels, lpixels, incs, 0,
+                     (int *) buffer, &anynull, status);
+
+               ffpprk(newptr, 1, firstpix, outsize, (int *) buffer, status);
+            }
+            else if (bitpix == -32)
+            {
+               ffgsve(fptr, 1, naxis, naxes, fpixels, lpixels, incs, FLOATNULLVALUE,
+                     (float *) buffer, &anynull, status);
+
+               ffppne(newptr, 1, firstpix, outsize, (float *) buffer, FLOATNULLVALUE, status);
+            }
+            else if (bitpix == -64)
+            {
+               ffgsvd(fptr, 1, naxis, naxes, fpixels, lpixels, incs, DOUBLENULLVALUE,
+                     buffer, &anynull, status);
+
+               ffppnd(newptr, 1, firstpix, outsize, buffer, DOUBLENULLVALUE,
+                     status);
+            }
+            else if (bitpix == 64)
+            {
+               ffgsvjj(fptr, 1, naxis, naxes, fpixels, lpixels, incs, 0,
+                     (LONGLONG *) buffer, &anynull, status);
+
+               ffpprjj(newptr, 1, firstpix, outsize, (LONGLONG *) buffer, status);
+            }
+
+
+            firstpix += outsize;
+         }
+      }
+   }
+
+   free(buffer);  /* finished with the memory */
+
+   if (*status > 0)
+   {
+      ffpmsg("fits_copy_image_section: error copying image section");
+      return(*status);
+   }
+
+   return(*status);
+}
+
diff --git a/data-access/engine/src/vlkb/src/imcopy.cpp.notes b/data-access/engine/src/vlkb/src/imcopy.cpp.notes
new file mode 100644
index 0000000000000000000000000000000000000000..731f1691e752fa6960a7125257400cdcb0312c8d
--- /dev/null
+++ b/data-access/engine/src/vlkb/src/imcopy.cpp.notes
@@ -0,0 +1,258 @@
+#include <string.h>
+#include <stdio.h>
+#include <stdlib.h>
+#include "fitsio.h"
+
+#include "imcopy.hpp"
+#include "io.hpp"
+
+int imcopy(std::string filename)
+//int main(int argc, char *argv[])
+{
+   LOG_trace(__func__);
+
+    fitsfile *infptr, *outfptr;   /* FITS file pointers defined in fitsio.h */
+    int status = 0, tstatus, ii = 1, iteration = 0, single = 0, hdupos;
+    int hdutype, bitpix, bytepix, naxis = 0, nkeys, datatype = 0, anynul;
+    long naxes[9] = {1, 1, 1, 1, 1, 1, 1, 1, 1};
+    long long first, totpix = 0, npix;
+    double *array, bscale = 1.0, bzero = 0.0, nulval = 0.;
+    char card[81];
+
+    /* Open the input file and create output file */
+    std::cerr << "BEFORE fits_open_file()" << std::endl;
+
+    fits_open_file(&infptr, filename.c_str(), READONLY, &status);
+
+    fits_close_file(infptr, &status);
+
+      return 0;
+
+   // FINAL: ideally re-implement fits_copy_image_section with buffering;
+   // it relies on a function set for each type:
+   //  fits_read_subset_dbl  ffgsvd
+   //   fits_write_imgnull_dbl ffppnd
+   //
+   //   but it always copies the entire image in one step, which for huge cuts runs out of memory.
+   //   e.g. re-do it in smaller chunks
+   // OR check whether something similar has already been done?
+
+
+
+
+    std::cerr << "AFTER  fits_open_file() status: " << status << std::endl;
+   LOG_STREAM << "fits_open_file()" << std::endl;
+      LOG_STREAM.flush();
+    //status = CREATE_DISK_FILE;
+    fits_create_file(&outfptr, "-", &status);
+    std::cerr << "AFTER  fits_create_file()" << std::endl;
+    //fits_create_file(&outfptr, argv[2], &status);
+
+
+   // NOTES two independent problems:
+   // * redirect to stdout '-' causes 100% mem usage and 'crash'
+   // * extended syntax with pixel ranges at fits_open will reserve memory for all the area
+   //   opening with fits_diskfile (no extended syntax) -> mem at same level, ok.
+   // * e.g. copying without extended-syntax to a filename works ok also for big files: copies the whole file
+   //
+   // After seeing cfitsio sources cfileio.c :
+   // imcopy could be implemented with only fits_open_file() and fits_close_file() with filename:
+   // /path/to/fitsfile.fits(temp_cut_uniqfname.fits)[a:B c:d]
+   //
+   // open and closing such file creates temp_cut_uniqfname.fits with the subimg
+   // if fname not given fitsfile is created in memory -> problem if it is too big
+   // Dirty solution: 
+   // check size based on filterstring [a:b c:d ...] -> pixelsize
+   // if size > LIMIT_IN_PIXELS=100*1024*1024 : assuming 4-byte (float) pixels -> 4*LIMIT_IN_PIXELS ~ 400MB
+   //    use filename(tempfilename)[] -> operate on disk and then open tempfilename and stream it to stdout
+   // else
+   //    do as now: outfile = '-' & infile = filename[pixfilter]
+   //
+
+    // in detail:
+   // what does '-' do to fits_create_file (how is redirect implemented in cfitsio) ?
+   // what does extended syntax[pixels] do to fits_open_file ?
+
+
+   LOG_STREAM << "fits_create_file()" << std::endl;
+
+      LOG_STREAM.flush();
+
+    if (status != 0) {    
+        fits_report_error(stderr, status);
+        return(status);
+    }
+
+    fits_get_hdu_num(infptr, &hdupos);  /* Get the current HDU position */
+         std::cerr << "hdupos: " << hdupos << std::endl;
+
+    /* Copy only a single HDU if a specific extension was given */ 
+    //if (hdupos != 1 || strchr(filename.c_str(), '[')) single = 1;
+
+    //for (; !status; hdupos++)  /* Main loop through each extension */
+    {
+
+      fits_get_hdu_type(infptr, &hdutype, &status);
+
+      if (hdutype == IMAGE_HDU) {
+         std::cerr << "hdutype IMAGE_HDU: " << hdutype << std::endl;
+
+          /* get image dimensions and total number of pixels in image */
+          for (ii = 0; ii < 9; ii++)
+              naxes[ii] = 1;
+
+          fits_get_img_param(infptr, 9, &bitpix, &naxis, naxes, &status);
+
+          totpix = naxes[0] * naxes[1] * naxes[2] * naxes[3] * naxes[4]
+             * naxes[5] * naxes[6] * naxes[7] * naxes[8];
+      }
+
+      if (hdutype != IMAGE_HDU || naxis == 0 || totpix == 0) { 
+
+         std::cerr << "not IMAGE_HDO ???" << std::endl;
+
+          /* just copy tables and null images */
+          //fits_copy_hdu(infptr, outfptr, 0, &status);
+
+      } else {
+
+         std::cerr << "bitpix naxis naxes[]: " << bitpix << " " << naxis;
+          LOG_STREAM << "bitpix naxis naxes[]: " << bitpix << " " << naxis;
+          for(int ii=0; ii<10; ii++) std::cerr << " "<< naxes[ii];
+          for(int ii=0; ii<10; ii++) LOG_STREAM << " "<< naxes[ii];
+          std::cerr << std::endl;
+          LOG_STREAM << std::endl;
+
+          /* Explicitly create new image, to support compression */
+          fits_create_img(outfptr, bitpix, naxis, naxes, &status);
+          if (status) {
+                 fits_report_error(stderr, status);
+                 return(status);
+          }
+/*
+          if (fits_is_compressed_image(outfptr, &status)) {
+
+             LOG_STREAM << "creating EXTNAME COMPRESSED_IMAGE" << std::endl;
+
+             /* write default EXTNAME keyword if it doesn't already exist * /
+             tstatus = 0;
+             fits_read_card(infptr, "EXTNAME", card, &tstatus);
+             if (tstatus) {
+                strcpy(card, "EXTNAME = 'COMPRESSED_IMAGE'   / name of this binary table extension");
+                fits_write_record(outfptr, card, &status);
+             }
+          }
+*/
+          /* copy all the user keywords (not the structural keywords) */
+          fits_get_hdrspace(infptr, &nkeys, NULL, &status); 
+         std::cerr << "nkeys: " << nkeys << std::endl;
+
+          for (ii = 1; ii <= nkeys; ii++) {
+             fits_read_record(infptr, ii, card, &status);
+             if (fits_get_keyclass(card) > TYP_CMPRS_KEY)
+                fits_write_record(outfptr, card, &status);
+          }
+
+          /* delete default EXTNAME keyword if it exists */
+          /*
+             if (!fits_is_compressed_image(outfptr, &status)) {
+             tstatus = 0;
+             fits_read_key(outfptr, TSTRING, "EXTNAME", card, NULL, &tstatus);
+             if (!tstatus) {
+             if (strcmp(card, "COMPRESSED_IMAGE") == 0)
+             fits_delete_key(outfptr, "EXTNAME", &status);
+             }
+             }
+             */
+
+          switch(bitpix) {
+             case BYTE_IMG:
+                datatype = TBYTE;
+                break;
+             case SHORT_IMG:
+                datatype = TSHORT;
+                break;
+             case LONG_IMG:
+                datatype = TINT;
+                break;
+             case FLOAT_IMG:
+                datatype = TFLOAT;
+                break;
+             case DOUBLE_IMG:
+                datatype = TDOUBLE;
+                break;
+          }
+
+          bytepix = abs(bitpix) / 8;
+
+          npix = totpix;
+          iteration = 0;
+
+          while(npix > 16*1024*1024) npix = npix / 2;
+
+          /* try to allocate memory for the entire image */
+          /* use double type to force memory alignment */
+          array = (double *) calloc(npix, bytepix);
+
+          /* if allocation failed, divide size by 2 and try again */
+          while (!array && iteration < 10)  {
+             iteration++;
+             npix = npix / 2;
+             array = (double *) calloc(npix, bytepix);
+          }
+
+          LOG_STREAM << "npix bytepix totpix :" << npix << " " << bytepix << " " << totpix << std::endl;
+          std::cerr << "npix bytepix totpix :" << npix << " " << bytepix << " " << totpix << std::endl;
+
+          if (!array)  {
+             printf("Memory allocation error\n");
+             return(0);
+          }
+
+          /* turn off any scaling so that we copy the raw pixel values */
+          fits_set_bscale(infptr,  bscale, bzero, &status);
+          fits_set_bscale(outfptr, bscale, bzero, &status);
+
+          first = 1;
+          while (totpix > 0 && !status)
+          {
+         //    LOG_STREAM << " "<< first ; LOG_STREAM.flush();
+         //    std::cerr << " "<< first ;
+             /* read all or part of image then write it back to the output file */
+             fits_read_img(infptr, datatype, first, npix, 
+                   &nulval, array, &anynul, &status);
+
+
+             fits_write_img(outfptr, datatype, first, npix, array, &status);
+             fits_flush_file(outfptr, &status);
+
+             totpix = totpix - npix;
+             first  = first  + npix;
+          }
+          LOG_STREAM << " X "<< first << std::endl;LOG_STREAM.flush();
+          std::cerr << " X "<< first << std::endl;
+
+          free(array);
+          LOG_STREAM << " mem freed " << std::endl;LOG_STREAM.flush();
+          std::cerr << " mem freed " << std::endl;
+
+          fits_close_file(infptr, &status);
+          std::cerr << " close file in " << std::endl;
+
+
+      }
+
+      //if (single) break;  /* quit if only copying a single HDU */
+      //fits_movrel_hdu(infptr, 1, NULL, &status);  /* try to move to next HDU */
+    }
+
+          std::cerr << " HDU done " << std::endl;
+    if (status == END_OF_FILE)  status = 0; /* Reset after normal error */
+
+    fits_close_file(outfptr,  &status);
+          std::cerr << " close file out " << std::endl;
+    /* if error occurred, print out error message */
+    //if (status)
+    //   fits_report_error(stderr, status);
+    return(status);
+}
diff --git a/data-access/engine/src/vlkb/src/imcopy.hpp b/data-access/engine/src/vlkb/src/imcopy.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..f58eadb45cc5e787de3ed589c761f7cc1566a96f
--- /dev/null
+++ b/data-access/engine/src/vlkb/src/imcopy.hpp
@@ -0,0 +1,9 @@
+#ifndef IMCOPY_HPP
+#define IMCOPY_HPP
+
+#include <string>
+
+int imcopy(std::string filename, int extnum, std::string pixfilter, std::string temp_root);
+int stream_cutout(std::string pathname, int extnum, std::string region);
+#endif
+
diff --git a/data-access/engine/src/vlkb/src/main.cpp b/data-access/engine/src/vlkb/src/main.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..300bfd1ed3a30c49ebd53a185c856242c8120e5a
--- /dev/null
+++ b/data-access/engine/src/vlkb/src/main.cpp
@@ -0,0 +1,617 @@
+
+#include "removecard.hpp"
+#include "mergefiles.hpp"
+#include "dropdegen.hpp"
+#include "multicutout.hpp"
+#include "ast.hpp"
+#include "imcopy.hpp" // imcopy() vlkb_cutout()
+#include "ast4vl.hpp" // uint_bounds needed
+#include "fitsfiles.hpp"
+#include "service_string.hpp" // to_cfitsio_format() needed
+
+//#include "config.hpp"
+#include "io.hpp"
+
+#include <algorithm> // replace needed
+#include <iostream>
+#include <sstream>
+
+#include <assert.h>
+#include <libgen.h> // basename()
+
+
+//std::ostream& OUT_STREAM = std::cout;
+//std::ostream& ERROR_STREAM = std::cerr;
+
+
+using namespace std;
+
+namespace vlkb
+{
+   //config conf;
+   const string fits_path=".";
+   const string fits_cut_path=".";
+
+
+   //--------------------------------------------
+   // util funcs
+   //--------------------------------------------
+
+   void usage(const std::string progname)
+   {
+      std::cerr
+         << "Usage: " << progname << " <command> [cmd-options] [cmd-args]" << endl
+         << "\n where commands are:\n "
+         << "\n\t cutout imcopy cutpixels multicutout mergefiles\n"
+         << "\n\t listbounds skyvertices overlap\n"
+         << "\n\t nullvals dropdegen checkcard addcard modcard rawdelcard\n"
+         << std::endl
+         << "Version: " << VERSIONSTR << " " << BUILD << std::endl;
+   }
+
+
+
+   // program recognizes following commands
+
+   enum cmd_set {
+      multicutout, mergefiles, cutout, imcopy, cutpixels,
+      listbounds, overlap, skyvertices, nullvals, dropdegen, checkcard, addcard, modcard, rawdelcard};
+
+
+   // from bash or interpreters usually receive params as strings
+
+   cmd_set to_cmd(const std::string cmdstr)
+   {
+      cmd_set cmd;
+
+      if(cmdstr.compare("multicutout") == 0)  cmd = multicutout;
+      else if(cmdstr.compare("mergefiles") == 0)   cmd = mergefiles;
+      else if(cmdstr.compare("cutout") == 0)    cmd = cutout;
+      else if(cmdstr.compare("imcopy") == 0)    cmd = imcopy;
+      else if(cmdstr.compare("cutpixels") == 0)    cmd = cutpixels;
+      else if(cmdstr.compare("listbounds") == 0)   cmd = listbounds;
+      else if(cmdstr.compare("overlap") == 0)      cmd = overlap;
+      else if(cmdstr.compare("skyvertices") == 0)  cmd = skyvertices;
+      else if(cmdstr.compare("nullvals") == 0)    cmd = nullvals;
+      else if(cmdstr.compare("dropdegen") == 0)    cmd = dropdegen;
+      else if(cmdstr.compare("checkcard") == 0)    cmd = checkcard;
+      else if(cmdstr.compare("addcard") == 0)     cmd = addcard;
+      else if(cmdstr.compare("modcard") == 0)      cmd = modcard;
+      else if(cmdstr.compare("rawdelcard") == 0)   cmd = rawdelcard;
+      else
+      {
+         stringstream cmd_err;
+         cmd_err << "String \'" << cmdstr << "\' not recognized as a valid command." << endl;
+         throw(invalid_argument(cmd_err.str()));
+      }
+
+      return cmd;
+   }
+
+} // namespace
+
+
+
+//-----------------------------------------------------------
+// all commands handle [argc,argv] and return a code:
+// 0 - Ok: command produces result as described in 'usage'
+// non-zero - Warning/Error: the response has a form other than
+// the one described in 'usage'
+//-----------------------------------------------------------
+
+
+int cmd_multicutout(int argc, char * argv[])
+{
+   int rc;
+
+   switch(argc)
+   {
+      case 2:
+         {
+            string json_request_filename(argv[1]);
+            std::string tgzfilename = multicutout(json_request_filename, vlkb::fits_path, vlkb::fits_cut_path);//vlkb::conf);
+            cout << tgzfilename << endl;
+            rc = EXIT_SUCCESS;
+         }
+         break;
+
+      default:
+         cerr << "Usage: multicutout <filename_with_json_request>" << endl;
+         rc = EXIT_FAILURE;
+   }
+
+   return rc;
+}
+
+
+
+int cmd_nullvals(int argc, char * argv[])
+{
+   int rc;
+
+   switch(argc)
+   {
+      case 2:
+         {
+            unsigned long long null_cnt;
+            unsigned long long total_cnt;
+            double fill_ratio = fitsfiles::calc_nullvals(argv[1], /*hdunum*/1, null_cnt, total_cnt);
+            cout << "fill ratio: " << fill_ratio << "%  null_cnt/total_cnt : " << null_cnt 
+               << "/" << total_cnt << endl;
+            rc = EXIT_SUCCESS;
+         }
+         break;
+
+      default:
+         std::cerr
+            << "Usage: nullvals <filename.fits>\n"
+            << std::endl
+            <<"Calculates number of undefined pixels (null values).\n"
+            << std::endl;
+         rc = EXIT_FAILURE;
+   }
+
+   return rc;
+}
+
+
+
+int cmd_mergefiles(int argc, char * argv[])
+{
+   if(argc < 2)
+   {
+      std::cerr
+         << "Usage:  mergefiles [filename.fits ...]\n"
+         << "\n"
+         << "Merge FITS-files supplied as argument.\n"
+         << "Examples: \n"
+         << "   vlkb mergefiles /rootvialactea/CHaMP/region6.fits /rootvialactea/CHaMP/region7.fits\n";
+      return EXIT_FAILURE;
+   }
+   else
+   {
+      const vector<string> files_to_merge(argv + 1, argv + argc);
+      for(string file : files_to_merge) cout << file << endl;
+      string merged_file = mergefiles(files_to_merge);
+      return EXIT_SUCCESS;
+   }
+}
+
+
+
+
+int cmd_imcopy(int argc, char * argv[])
+{
+   if((argc == 4) || (argc == 5))
+   {
+      std::string infilename{argv[1]};
+      int extnum = std::stoi(std::string{argv[2]});
+      std::string pixfilter{argv[3]};
+      std::string temp_root = ((argc == 5) ? argv[4] : "/tmp" );
+      int rc = imcopy(infilename, extnum, pixfilter, temp_root);
+      if(rc)
+         std::cout << "rc = " << rc << std::endl;
+      return EXIT_SUCCESS;
+   }
+   else
+   {
+      std::cerr
+         << "Usage:  imcopy filename.fits extnum [a:b c:d ...] <temp-root>\n"
+         << "\n"
+         << "Send to stdout a subimage of N-dimesional FITS-file HDU with N-element pixel filter.\n"
+         << "HDU is given by extension number (0=Primary HDU, 1=Ext1, ...)"
+         << "<temp-root> is rw storage to hold the fits-cut while streaming (optional, default is '/tmp')"
+         << "Examples: \n"
+         << "   vlkb imcopy /rootvialactea/CHaMP/region7.fits 0 [1:100,1:100,1:100]\n";
+      return EXIT_FAILURE;
+   }
+}
+
+
+
+int cmd_cutout(int argc, char * argv[])
+{
+   if (argc != 4)
+   {
+      std::cerr
+         << "Usage:  overlap <filename.fits> <extnum> <region>\n"
+         << "\n"
+         << "Calculate overlap between HDU in file and region.\n\nregion in JSON form of VO-SODA params. For example 'POS=CIRCLE 21.4458 -1.373 0.1' :\n"
+         " \'{\"pos\":{\"circle\":{\"lat\":-1.373,\"lon\":21.4458,\"radius\":0.1},\"system\":\"ICRS\"},\"service\":\"SUBIMG\"}\'\n";
+      return EXIT_FAILURE;
+   }
+   else
+   {
+      string pathname{argv[1]};
+      int extnum = std::stoi(std::string{argv[2]});
+      string region{argv[3]};
+
+      int rc = stream_cutout(pathname, extnum, region);
+      if(!rc)
+         return EXIT_SUCCESS;
+   }
+   return EXIT_FAILURE;
+}
+
+
+
+
+int cmd_cutpixels(int argc, char * argv[])
+{
+   if(argc < 2)
+   {
+      std::cerr
+         << "Usage:  cutpixles filename.fits[a:b,c:d,...]\n"
+         << "\n"
+         << "Cut part of N-dimesional FITS-file on pixel coordinates (using cfitsio extended syntax).\n"
+         << "Examples: \n"
+         << "   vlkb cutpixels /rootvialactea/CHaMP/region7.fits[1:100,1:100,1:100]\n";
+      return EXIT_FAILURE;
+   }
+   else
+   {
+      const unsigned long int hdunum = 1;
+      std::string infilename{argv[1]};
+
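+      // derive the output name from the input, e.g. (illustrative path):
+      //   /path/region7.fits[1:100,1:200]  ->  region7_CUT_1-100_1-200.fits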
+      size_t last_dot   = infilename.find_last_of(".");
+      size_t last_slash = infilename.find_last_of("/");
+      size_t last_open_bracket  = infilename.find_last_of("[");
+      size_t last_close_bracket = infilename.find_last_of("]");
+
+      std::string pix_ranges = infilename.substr(last_open_bracket+1, last_close_bracket-last_open_bracket-1);
+      std::replace(pix_ranges.begin(), pix_ranges.end(), ':', '-');
+      std::replace(pix_ranges.begin(), pix_ranges.end(), ',', '_');
+
+      const std::string outfilename{infilename.substr(last_slash+1, last_dot-last_slash-1 ) + "_CUT_" + pix_ranges + ".fits"};
+
+      fitsfiles::fits_hdu_cut(infilename, hdunum, outfilename);
+      return EXIT_SUCCESS;
+   }
+}
+
+
+
+int cmd_listbounds(int argc, char * argv[])
+{
+   int rc;
+
+   if (!((argc == 2) || (argc == 3) || (argc == 4)))
+   {
+      std::cerr
+         << "Usage:  listbounds <filename.fits> <SkySystem> <SpecSystem>\n"
+         << "\n"
+         << "List the bounds of the region represented by FITS header.\n"
+         << "Arguments:\n"
+         << "   SkySystem  GALACTIC or ICRS\n"
+         << "   SpecSystem System=VELO,StdOfRest=LSRK,Unit=km/s or WAVE, Barycentric, m\n";
+      rc = EXIT_FAILURE;
+   }
+   else
+   {
+      string pathname(argv[1]);
+
+      string skySystem;
+      string specSystem;
+
+      if(argc >= 3)
+      {
+         skySystem = argv[2];
+      }
+      if(argc == 4)
+      {
+         specSystem = argv[3];
+      }
+
+      cout << string{argv[0]} << ": " << pathname << " '" << skySystem  << "' " << specSystem << endl;
+
+      rc = vlkb_listbounds(skySystem, specSystem, pathname);
+      std::cout << "vlkb_listbounds rc: " << rc << std::endl;
+      rc = (rc == 0) ? EXIT_SUCCESS : EXIT_FAILURE;
+   }
+   return rc;
+}
+
+
+
+int cmd_skyvertices(int argc, char * argv[])
+{
+   int rc;
+
+   if (argc != 3)
+   {
+      std::cerr
+         << "Usage:  skyvertices <filename.fits[ext]> <SkySystem>\n"
+         << "\n"
+         << "List vertices in sky of the region represented by FITS header.\n"
+         << "Note SkySystem  GALACTIC or ICRS\n";
+      rc = EXIT_FAILURE;
+   }
+   else
+   {
+      string pathname(argv[1]);
+      string skySystem(argv[2]);
+
+      cout << string{argv[0]} << ": " << pathname << " '" << skySystem  << "' " << endl;
+
+      rc = vlkb_skyvertices(pathname, skySystem);
+      std::cout << "vlkb_skyvertices rc: " << rc << std::endl;
+      rc = (rc == 0) ? EXIT_SUCCESS : EXIT_FAILURE;
+   }
+   return rc;
+}
+
+
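+// messages for the overlap codes returned by vlkb_overlap (AST astOverlap convention)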
+char overlapmsg[7][512] = {
+   "0 - The check could not be performed because the second Region could not be mapped into the coordinate system of the first Region.",
+   "1 - There is no overlap between the two Regions.",
+   "2 - The first Region is completely inside the second Region.",
+   "3 - The second Region is completely inside the first Region.",
+   "4 - There is partial overlap between the two Regions.",
+   "5 - The Regions are identical to within their uncertainties.",
+   "6 - The second Region is the exact negation of the first Region to within their uncertainties."
+};
+
+int cmd_overlap(int argc, char * argv[])
+{
+   int rc;
+
+   // an optional trailing 'x' argument is accepted and ignored (undocumented)
+   bool is_last_x = (*argv[argc-1] == 'x');
+   if(is_last_x) argc--;
+
+   if (argc != 3)
+   {
+      std::cerr
+         << "Usage:  overlap <filename.fits[ext]> <region>\n"
+         << "\n"
+         << "Calculate overlap between HDU in file and region.\n\nregion in JSON form of VO-SODA params. For example 'POS=CIRCLE 21.4458 -1.373 0.1' :\n"
+         " \'{\"pos\":{\"circle\":{\"lat\":-1.373,\"lon\":21.4458,\"radius\":0.1},\"system\":\"ICRS\"},\"service\":\"SUBIMG\"}\'\n";
+      rc = EXIT_FAILURE;
+   }
+   else
+   {
+      string pathname{argv[1]};
+      string region{argv[2]};
+
+      vector<uint_bounds> bnds;
+      int ov_code = vlkb_overlap(pathname, region, bnds);
+      if((ov_code >= 2) && (ov_code <= 5))
+      {
+         rc = EXIT_SUCCESS;
+         cout << to_cfitsio_format(bnds) << endl;
+      }
+      else if((ov_code == 1) || (ov_code == 6))
+      {
+         rc = EXIT_SUCCESS;
+      }
+      else
+      {
+         // ov_code == 0: the check could not be performed (see overlapmsg[0])
+         throw runtime_error("overlap check failed, code: " + to_string(ov_code));
+      }
+   }
+   return rc;
+}
+
+
+int cmd_dropdegen(int argc, char * argv[])
+{
+   int rc;
+
+   if (argc != 2)
+   {
+      std::cerr
+         << "Usage:  dropdegen <filename.fits[ext]>\n"
+         << "\n"
+         << "Drop degenerate axis (axis with NAXISi=1). Given file will be overwritten! Make a copy before.\n"
+         << "Note that it may be necessary to enclose the input file\n"
+         << "name in single quote characters on the Unix command line.\n";
+      rc = EXIT_FAILURE;
+   }
+   else
+   {
+      rc = vlkb_dropdegen(argv[1]);
+      std::cout << "vlkb_dropdegen rc: " << rc << std::endl;
+      rc = (rc == 0) ? EXIT_SUCCESS : EXIT_FAILURE;
+   }
+   return rc;
+}
+
+
+int cmd_checkcard(int argc, char * argv[])
+{
+   int rc;
+
+   if (argc != 3)
+   {
+      std::cerr
+         << "Usage:  checkcard <keyname> <filename.fits>\n"
+         << "\n"
+         << "Checks if keyname present in fits header.\n";
+      rc = EXIT_FAILURE;
+   }
+   else
+   {
+      const unsigned int hdunum = 1;
+      string card{fitsfiles::read_card(argv[2], hdunum, argv[1])};
+      cout << card << endl;
+      rc = EXIT_SUCCESS;
+   }
+   return rc;
+}
+
+int cmd_addcard(int argc, char * argv[])
+{
+   int rc;
+
+   if (argc != 4)
+   {
+      std::cerr
+         << "Usage:  addcard key value <filename.fits>\n"
+         << "\n"
+         << "Adds card by key if it is missing from the header.\n";
+      rc = EXIT_FAILURE;
+   }
+   else
+   {
+      std::string key{argv[1]};
+      std::string value{argv[2]};
+      std::string filename{argv[3]};
+      cout << "NOT IMPLEMENTED: use modcard - it adds card if not exist" << endl;
+      rc = EXIT_FAILURE; // FIXME
+   }
+   return rc;
+}
+
+
+
+int cmd_modcard(int argc, char * argv[])
+{
+   int rc;
+
+   if (argc < 3)
+   {
+      std::cerr
+         << "Usage:  modcard token [-v newvalue] <*.fits>\n"
+         << "\n"
+         << "Overwrites value in key-record which contains string 'token' anywhere in card\n";
+      rc = EXIT_FAILURE;
+   }
+   else
+   {
+      cout << "argc: " << argc << endl;
+      string newvalue;
+      string token{argv[1]};
+      string arg2{argv[2]};
+
+      cout << "token: " << token << endl;
+      cout << "arg2: " << arg2 << endl;
+
+      int ii_start;
+      if(0 == arg2.compare("-v"))
+      {
+         newvalue = string{argv[3]};
+         ii_start = 4;
+      }
+      else
+      {
+         ii_start = 2;
+      }
+
+      cout << "newvalue: " << newvalue << endl;
+      cout << "files: " << endl;
+
+      int ii;
+      for(ii=ii_start; ii<argc; ii++)
+      {
+         cout << argv[ii] << endl;
+         rc = fitsfiles::mod_value(argv[ii], token, newvalue);
+         std::cout << "fitsfiles::mod_value rc: " << rc << std::endl;
+         rc = (rc == 0) ? EXIT_SUCCESS : EXIT_FAILURE;
+      }
+   }
+   return rc;
+}
+
+
+int cmd_rawdelcard(int argc, char * argv[])
+{
+   int rc;
+
+   if (argc < 3)
+   {
+      std::cerr
+         << "Usage:  rawdelcard keyname <*.fits>\n"
+         << "\n"
+         << "Extends keyname with spaces to 8 chars and removes all occurences from the header"
+         " ('raw': does not use cfitsio and so it can remove cards also from non-standard headers"
+         " (like when by mistake wrong card inserted before NAXISn card.)\n"
+         << "Note: currently implemented only for Primary HDU.\n";
+      rc = EXIT_FAILURE;
+   }
+   else
+   {
+      cout << "argc: " << argc << endl;
+      string newvalue;
+      string keyname{argv[1]};
+
+      cout << "keyname: " << keyname << endl;
+      cout << "files: " << endl;
+      //const unsigned int hdunum = 1;
+      int ii;
+      for(ii=2; ii<argc; ii++)
+      {
+         cout << argv[ii] << endl;
+         remove_raw_card(argv[ii], /*hdunum,*/ keyname);
+      }
+      rc = EXIT_SUCCESS;
+   }
+   return rc;
+}
+
+
+
+
+//-----------------------------------------------------------
+// main
+//-----------------------------------------------------------
+int main (int argc, char * argv[])
+{
+   const std::string progname = basename(argv[0]);
+
+   if( argc < 2 )
+   {
+      vlkb::usage(progname);
+      return EXIT_FAILURE;
+   }
+
+   LOG_open("/tmp", string("vlkb-") + string(argv[1]) + ".log");
+
+   int rc = EXIT_SUCCESS;
+   try
+   {
+      const vlkb::cmd_set cmd(vlkb::to_cmd(argv[1]));
+
+      int cmd_argc = argc - 1;
+      char ** cmd_argv = &(argv[1]);
+
+      switch(cmd)
+      {
+         case vlkb::cutout:      rc = cmd_cutout(cmd_argc, cmd_argv); break;
+         case vlkb::imcopy:      rc = cmd_imcopy(cmd_argc, cmd_argv); break;
+         case vlkb::cutpixels:   rc = cmd_cutpixels(cmd_argc, cmd_argv); break;
+         case vlkb::multicutout: rc = cmd_multicutout(cmd_argc, cmd_argv); break;
+         case vlkb::mergefiles:  rc = cmd_mergefiles(cmd_argc, cmd_argv); break;
+
+         case vlkb::nullvals:    rc = cmd_nullvals(cmd_argc, cmd_argv); break;
+         case vlkb::listbounds:  rc = cmd_listbounds(cmd_argc, cmd_argv); break;
+         case vlkb::skyvertices: rc = cmd_skyvertices(cmd_argc, cmd_argv); break;
+         case vlkb::overlap:     rc = cmd_overlap(cmd_argc, cmd_argv); break;
+
+         case vlkb::dropdegen:   rc = cmd_dropdegen(cmd_argc, cmd_argv); break;
+         case vlkb::checkcard:   rc = cmd_checkcard(cmd_argc, cmd_argv); break;
+         case vlkb::addcard:     rc = cmd_addcard(cmd_argc, cmd_argv); break;
+         case vlkb::modcard:     rc = cmd_modcard(cmd_argc, cmd_argv); break;
+         case vlkb::rawdelcard:  rc = cmd_rawdelcard(cmd_argc, cmd_argv); break;
+
+         default: assert(false);
+      }
+   }
+   catch(const invalid_argument& ex)
+   {
+      cerr << "invalid_argument: " << ex.what() << endl;
+      rc = EXIT_FAILURE;
+   }
+   catch(const runtime_error& ex)
+   {
+      cerr << "runtime_error: " << ex.what() << endl;
+      rc = EXIT_FAILURE;
+   }
+   catch(const exception& ex)
+   {
+      cerr <<  "exception: " << ex.what() << endl;
+      rc = EXIT_FAILURE;
+   }
+
+   LOG_close();
+   return rc;
+}
+
diff --git a/data-access/engine/src/vlkb/src/mergefiles.cpp b/data-access/engine/src/vlkb/src/mergefiles.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..71daa209cc28ccdc2f001bf8aae704ce8f15af98
--- /dev/null
+++ b/data-access/engine/src/vlkb/src/mergefiles.cpp
@@ -0,0 +1,46 @@
+
+#include "mergefiles.hpp"
+
+#include "io.hpp"
+//#include "mcutout.hpp"
+#include <sys/param.h> // NAME_MAX PATH_MAX
+
+
+#include <string>
+#include <vector>
+
+#include <string.h>
+
+
+
+
+using namespace std;
+
+
+
+string mergefiles(const vector<string> & filenames)
+{
+   const size_t fcnt = filenames.size();
+   char ffs[fcnt][PATH_MAX+NAME_MAX];
+
+   size_t i = 0;
+   for(string filename : filenames)
+      strcpy(ffs[i++], filename.c_str());
+/* 
+   char * fitsfs[fcnt];
+   for(i=0;i<fcnt;i++) fitsfs[i] = ffs[i];
+
+   struct merge_files mf = {"/tmp",".","X"};// = mroot mresdir prefix
+   char m_result[PATH_MAX+NAME_MAX];
+
+  
+   int rc = M4VL_mergefiles(&mf, fcnt, fitsfs, m_result, PATH_MAX+NAME_MAX);
+   if(rc != 0) cerr << "M4VL_mergefiles returned rc: " << to_string(rc) << endl;
+
+   return string(m_result);
+*/
+   return string("FIXME was implemented with M4VL_mergefiles which is common-internal. Re-implement with xmergefiles which is commp-API.");
+}
+
+
+
diff --git a/data-access/engine/src/vlkb/src/mergefiles.hpp b/data-access/engine/src/vlkb/src/mergefiles.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..2f0fca88bfd0f445426ef403d31ed32a70088d15
--- /dev/null
+++ b/data-access/engine/src/vlkb/src/mergefiles.hpp
@@ -0,0 +1,9 @@
+#ifndef MERGEFILES_HPP
+#define MERGEFILES_HPP
+
+#include <string>
+#include <vector>
+
+std::string mergefiles(const std::vector<std::string> & filenames);
+
+#endif
diff --git a/data-access/engine/src/vlkb/src/multicutout.cpp b/data-access/engine/src/vlkb/src/multicutout.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..52d8699f31a8395ba43d0c312eb069fae3ee027f
--- /dev/null
+++ b/data-access/engine/src/vlkb/src/multicutout.cpp
@@ -0,0 +1,35 @@
+
+#include <fstream>
+#include <stdexcept>
+#include <string>
+
+//#include "config.hpp"
+#include "json.hpp"
+#include "mcutout_nljson.hpp"
+#include "mcutout_ostream.hpp"
+#include "mcutout.hpp"
+#include "multicutout.hpp"
+
+
+using json = nlohmann::json;
+
+using namespace std;
+
+string multicutout(string json_request_filename, const string fits_path, const string fits_cut_path)
+{
+   const bool ALLOW_EXCEPTIONS = true; // third argument of json::parse
+
+   // read mcutout json from file
+
+   std::ifstream ifs(json_request_filename);
+   if(!ifs)
+      throw std::runtime_error("cannot open JSON request file: " + json_request_filename);
+   const std::string json_str( (std::istreambuf_iterator<char>(ifs) ),
+         (std::istreambuf_iterator<char>()    ) );
+
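+   // the request file is expected to hold a JSON array of cut-parameter objects,
+   // deserialized via mcutout_nljson into vector<cut_param_s>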
+   // do cutouts
+
+   json jcuts = json::parse(json_str, nullptr, ALLOW_EXCEPTIONS);
+   vector<struct cut_param_s> cut_params = jcuts.get<vector<struct cut_param_s>>();
+   struct mcutout_res_s mres = mcutout(cut_params, fits_path, fits_cut_path);
+
+   return mres.tgz_filename;
+}
+
diff --git a/data-access/engine/src/vlkb/src/multicutout.hpp b/data-access/engine/src/vlkb/src/multicutout.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..e25184cbe5e99b87b04739d11a6fc5227ef956e6
--- /dev/null
+++ b/data-access/engine/src/vlkb/src/multicutout.hpp
@@ -0,0 +1,14 @@
+#ifndef MULTICUTOUT_HPP
+#define MULTICUTOUT_HPP
+
+#include <string>
+
+//#include "config.hpp"
+
+/* FIXME: replace conf FITS-DIR and FITS-CUTDIR with args (so no conf-file needed)
+ * and the json file should be as AFTER the resolver was run: IDs resolved to local files
+ * (the ID itself not needed) */
+std::string multicutout(std::string json_request_filename, const std::string fits_path, const std::string fits_cut_path);
+
+#endif
+
diff --git a/data-access/engine/src/vlkb/src/removecard.cpp b/data-access/engine/src/vlkb/src/removecard.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..ec6cf4b547feb1544858ce5f137c6099dfad4c0a
--- /dev/null
+++ b/data-access/engine/src/vlkb/src/removecard.cpp
@@ -0,0 +1,90 @@
+
+#include "removecard.hpp"
+
+#include "fitsfiles.hpp"
+#include "io.hpp"
+
+#include <fstream>
+
+using namespace std;
+
+/* useful when a card is misplaced in the header: removes all cards matching
+ * keyname from the Primary HDU by rewriting the header in place */
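+/* FITS header layout assumed below: a sequence of 80-character cards,
+ * terminated by an END card and padded with spaces to a 2880-byte block */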
+void remove_raw_card(const std::string pathname, /*unsigned int hdunum,*/ std::string keyname)
+{
+   LOG_trace(__func__);
+
+   /* fitsfiles::reinsert_card(pathname, hdunum, keyname); */
+
+   const long HDU_BEGIN = 0;
+
+   char card[81] = {0};
+   int ii;
+
+   vector<string> cards;
+
+   cout << "keyname: "  << keyname << endl;
+
+   if(keyname.size() < 8)
+      keyname += string(8-keyname.size(), ' ');
+
+   fstream fileBuffer(pathname, ios::in|ios::out|ios::binary); // in|out: update the file in place (no truncation)
+   if (fileBuffer.is_open())
+   {
+      fileBuffer.seekg(HDU_BEGIN, ios::beg);
+      fileBuffer.read(card, 80);
+
+      string cardstr(card);
+      cards.push_back(cardstr);
+
+      cout << cards.size() << " " << cardstr << endl;
+
+      ii = 1;
+      while(cardstr.substr(0,8).compare("END     ") != 0)
+      {
+         cout << ii << " : " << cardstr << "<" << endl;
+         fileBuffer.read(card, 80);
+         cardstr = string(card);
+         ii++;
+
+         if(cardstr.substr(0,8).compare(keyname) == 0)
+            continue;
+         else
+            cards.push_back(cardstr);
+
+         cout << cards.size() << " " << cardstr << endl;
+      }
+      cout << ii << " : " << cardstr << "<" << endl;
+      cards.push_back(cardstr);
+   }
+   else
+   {
+      cerr << "cannot open file: " << pathname << endl;
+      return;
+   }
+
+   cout << cards.size() << endl;
+
+   fileBuffer.seekg(HDU_BEGIN, ios::beg);
+   for(string crd : cards)
+   {
+      cout << crd << endl;
+      fileBuffer.write(crd.c_str(), 80);
+      if(crd.substr(0,8).compare("END     ") == 0) break;
+   }
+
+   // pad up to the next 2880-byte block boundary (and drop possible duplicate END cards)
+
+   long fpos{fileBuffer.tellp()};
+
+   long block_fill = fpos % 2880;
+
+   cout << fpos << " " << block_fill << endl;
+
+   if(block_fill != 0)
+   {
+      long padding_len = 2880-block_fill;
+      cout << padding_len << endl;
+      string padding(padding_len, ' ');
+      cout <<'>' <<  padding << '<' <<  endl;
+      fileBuffer.write(padding.c_str(), padding_len);
+   }
+
+   fileBuffer.close();
+}
+
diff --git a/data-access/engine/src/vlkb/src/removecard.hpp b/data-access/engine/src/vlkb/src/removecard.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..9436ff4d2a5059784ae755f14ac128e3b124c06d
--- /dev/null
+++ b/data-access/engine/src/vlkb/src/removecard.hpp
@@ -0,0 +1,10 @@
+#ifndef REMOVECARD_HPP
+#define REMOVECARD_HPP
+
+
+#include <string>
+
+
+void remove_raw_card(const std::string pathname, /*unsigned int hdunum,*/ const std::string keyname);
+
+#endif
diff --git a/data-access/engine/src/vlkb/src/service_string.cpp b/data-access/engine/src/vlkb/src/service_string.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..8a77503515a56c951f5de963fec6489767d25bb9
--- /dev/null
+++ b/data-access/engine/src/vlkb/src/service_string.cpp
@@ -0,0 +1,34 @@
+
+#include "service_string.hpp"
+#include <stdexcept>
+
+using namespace std;
+
+skysystem to_skysystem(std::string str)
+{
+   if(str.compare("GALACTIC") == 0) return skysystem::GALACTIC;
+   else if(str.compare("ICRS") == 0) return skysystem::ICRS;
+   else throw invalid_argument("string must be GALACTIC or ICRS but was " + str);
+}
+
+specsystem to_specsystem(std::string str)
+{
+   if(str.compare("NONE") == 0) return specsystem::NONE;
+   else if(str.compare("VELO_LSRK") == 0) return specsystem::VELO_LSRK;
+   else if(str.compare("WAVE_Barycentric") == 0) return specsystem::WAVE_Barycentric;
+   else throw invalid_argument("string must be NONE, VELO_LSRK or WAVE_Barycentric but was " + str);
+}
+
+
+
+specsystem to_specsystem(int i)// special case for legacy interface, remove later
+{
+   switch(i)
+   {
+      case 0: return specsystem::NONE;
+      case 1: return specsystem::VELO_LSRK;
+      case 2: return specsystem::WAVE_Barycentric;
+      default: return specsystem::NONE;
+   }
+}
+
diff --git a/data-access/engine/src/vlkb/src/service_string.hpp b/data-access/engine/src/vlkb/src/service_string.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..416dc21c422a0c1f9060585e77be078aa85c05d4
--- /dev/null
+++ b/data-access/engine/src/vlkb/src/service_string.hpp
@@ -0,0 +1,15 @@
+#ifndef SERVICE_STRING_HPP
+#define SERVICE_STRING_HPP
+
+#include "cutout.hpp"
+#include "ast4vl.hpp" // uint_bounds needed
+#include <string>
+#include <vector>
+
+
+skysystem to_skysystem(std::string str);
+specsystem to_specsystem(std::string str);
+specsystem to_specsystem(int i);
+std::string to_cfitsio_format(std::vector<uint_bounds> bounds);
+
+#endif
diff --git a/data-access/engine/src/vlkb/vlkb-overlap-region.schema.json b/data-access/engine/src/vlkb/vlkb-overlap-region.schema.json
new file mode 100644
index 0000000000000000000000000000000000000000..274f337a8d824f4832cb708f74ab583f6bf01d2c
--- /dev/null
+++ b/data-access/engine/src/vlkb/vlkb-overlap-region.schema.json
@@ -0,0 +1,116 @@
+{
+        "title": "vlkb overlap <fits> <region>",
+        "description": "Describe region JSON-string",
+        "type": "object",
+        "properties": {
+                "pos": {
+                        "type": "object",
+                        "properties": {
+                                "system": {
+                                        "description": "GALACTIC ICRS",
+                                        "type": "array",
+                                        "items": {
+                                                "type": "string",
+                                                "enum": ["GALACTIC", "ICRS"]
+                                        }
+                                },
+                                "oneOf": [
+                                        {
+                                                "type": "object",
+                                                "properties":{
+                                                        "circle": {
+                                                                "description": "CIRCLE lon lat radius",
+                                                                "type": "object",
+                                                                "properties": {
+                                                                        "lon": {"type": "number"},
+                                                                        "lat": {"type": "number"},
+                                                                        "radius": {"type": "number"}
+                                                                },
+                                                                "required": ["lon", "lat", "radius"]
+                                                        }
+                                                },
+                                                "required": ["circle"]
+                                        },
+                                        {
+                                                "type": "object",
+                                                "properties":{
+                                                        "range": {
+                                                                "description": "RANGE lon1 lon2 lat1 lat2",
+                                                                "type": "object",
+                                                                "properties": {
+                                                                        "lon1": {"type": "number"},
+                                                                        "lon2": {"type": "number"},
+                                                                        "lat1": {"type": "number"},
+                                                                        "lat2": {"type": "number"}
+                                                                },
+                                                                "required": ["lon1", "lon2", "lat1", "lat2"]
+                                                        }
+                                                },
+                                                "required": ["range"]
+                                        },
+                                        {
+                                                "type": "object",
+                                                "properties":{
+                                                        "polygon": {
+                                                                "description": "POLYGON lon... lat...",
+                                                                "type": "object",
+                                                                "properties":{
+                                                                        "lon": {
+                                                                                "type": "array",
+                                                                                "items": "number"
+                                                                        },
+                                                                        "lat": {
+                                                                                "type": "array",
+                                                                                "items": "number"
+                                                                        }
+                                                                },
+                                                                "required": ["lon", "lat"]
+                                                        }
+                                                },
+                                                "required": ["polygon"]
+                                        }
+                        ]
+                },
+                "band": {
+                        "type": "object",
+                        "properties": {
+                                "system": {
+                                        "type": "array",
+                                        "items": {
+                                                "type": "string",
+                                                "enum": ["VELO_LSRK", "WAVE_Barycentric"]
+                                        }
+                                },
+                                "interval": {
+                                        "type": "array",
+                                        "items":{
+                                                "type": "number"
+                                        },
+                                        "minItems": 2,
+                                        "maxItems": 2
+                                }
+                        }
+                },
+                "time": {
+                        "type": "object",
+                        "properties": {
+                                "system": {"type": "string"},
+                                "interval": {
+                                        "type": "array",
+                                        "items":{
+                                                "type": "number"
+                                        },
+                                        "minItems": 2,
+                                        "maxItems": 2
+                                }
+                        }
+                },
+                "pol": {
+                        "type": "array",
+                        "items": {
+                                "type": "string",
+                                "enum": ["I", "Q", "U", "V", "RR", "LL", "RL", "LR", "XX", "YY", "XY", "YX"]
+                        },
+                        "minItems": 1
+                }
+        }
+}
+
diff --git a/data-access/engine/src/vlkb/vlkb.1 b/data-access/engine/src/vlkb/vlkb.1
new file mode 100644
index 0000000000000000000000000000000000000000..bd84b3c0cb9f34dde7baa168e46205024ccf055a
--- /dev/null
+++ b/data-access/engine/src/vlkb/vlkb.1
@@ -0,0 +1,22 @@
+.\"                                      Hey, EMACS: -*- nroff -*-
+.\" (C) Copyright 2023 ...
+.\"
+.TH vlkb 1 
+.SH NAME
+vlkb \- vlkb application
+.SH SYNOPSIS
+.B vlkb 
+.SH DESCRIPTION
+The 
+.B vlkb 
+is a utility to perform actions on FITS files, such as listing a file's header, calculating the
+limits of its coverage, or adding, modifying and removing header cards. The list of available sub-commands is printed by --help.
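+.SH EXAMPLES
+Typical invocations (file paths are illustrative):
+.PP
+.B vlkb cutpixels /path/region7.fits[1:100,1:100,1:100]
+.br
+.B vlkb listbounds file.fits GALACTIC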
+.SH SEE ALSO
+.BR vlkb-obscore (1),
+.BR vlkbd (1).
+.SH AUTHORS
+The
+.B vlkb 
+was written by 
+RBu <rbu@ia2.inaf.it>
+.PP
+This document was written by RBu <rbu@ia2.inaf.it> for Debian.
diff --git a/data-access/engine/src/vlkb/vlkb.changelog.Debian b/data-access/engine/src/vlkb/vlkb.changelog.Debian
new file mode 100644
index 0000000000000000000000000000000000000000..4f0759a651e37dec112cd03bb7c042a89524a131
--- /dev/null
+++ b/data-access/engine/src/vlkb/vlkb.changelog.Debian
@@ -0,0 +1,13 @@
+vlkb (1.4.8) stable; urgency=low
+
+  [ VLKB ]
+  * First release via deb and rpm packages.
+
+ -- INAF <RBu@ia2.inaf.com>  Sat, 23 Dec 2023 11:30:00 +0100
+
+vlkb (1.4.7) stable; urgency=low
+
+  [ INAF ]
+  * Adds support for SODA parameters (http://ivoa.net/documents).
+
+ -- INAF <RBu@ia2.inaf.org>  Wed,   4 Oct 2023 11:00:00 +0100
diff --git a/data-access/engine/src/vlkb/vlkb.control b/data-access/engine/src/vlkb/vlkb.control
new file mode 100644
index 0000000000000000000000000000000000000000..f6b27bc4adfb9a9d0bb8cb74d614e6080c862b31
--- /dev/null
+++ b/data-access/engine/src/vlkb/vlkb.control
@@ -0,0 +1,8 @@
+Package: vlkb
+Version:
+Section: utils
+Priority: optional
+Architecture: all
+Maintainer: VLKB <RBu@ia2.vlkb.org>
+Description: vlkb utility to perform operations on FITS-files
+ The list of available commands is printed by help.
+
diff --git a/data-access/engine/src/vlkb/vlkb.copyright b/data-access/engine/src/vlkb/vlkb.copyright
new file mode 100644
index 0000000000000000000000000000000000000000..a5c3f8ee0da297913abff540832419d1fdd0949f
--- /dev/null
+++ b/data-access/engine/src/vlkb/vlkb.copyright
@@ -0,0 +1,14 @@
+vlkb
+
+Copyright: 2023 INAF <ia2@inaf.com>
+
+2023-10-30
+
+The entire code base may be distributed under the terms of the GNU General
+Public License (GPL). Alternatively, all of the source code, as well as any
+code derived from that code, may instead be distributed under the GNU Lesser
+General Public License (LGPL), at the choice of the distributor.
+
+See /usr/share/common-licenses/(GPL|LGPL)
diff --git a/data-access/engine/src/vlkb/vlkb.datasets.conf b/data-access/engine/src/vlkb/vlkb.datasets.conf
new file mode 100644
index 0000000000000000000000000000000000000000..4535d3cf31f909bcf182fd80cdc988d206b2ede5
--- /dev/null
+++ b/data-access/engine/src/vlkb/vlkb.datasets.conf
@@ -0,0 +1,19 @@
+
+# path to generated cutouts
+fits_path_cutouts=/srv/vlkb/cutouts-FITSDB
+fits_url_cutouts=https://vlkb-devel.ia2.inaf.it:8443/CONTEXT_ROOT/cutouts
+#fits_url_cutouts=http://localhost:8080/CONTEXT_ROOT/cutouts
+
+# original datasets
+
+# root of path for local access
+fits_path_surveys=/srv/vlkb/surveys-FITSDB
+# root of url for remote access
+fits_url_surveys=https://vlkb-devel.ia2.inaf.it:8443/CONTEXT_ROOT/surveys
+#fits_url_surveys=http://ia2-vo.oats.inaf.it/vialactea-devel
+
+# obs_publisher_did
+ivoid_authority=ia2.inaf.it
+ivoid_resource_key=vlkb/dsetdesc
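+# combined into the publisher DID: ivo://ia2.inaf.it/vlkb/dsetdesc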
+
+
diff --git a/data-access/engine/src/vlkb/vlkb.gdb b/data-access/engine/src/vlkb/vlkb.gdb
new file mode 100644
index 0000000000000000000000000000000000000000..7bf5404d81922f61bd1aa2c4967a606dad30efe4
--- /dev/null
+++ b/data-access/engine/src/vlkb/vlkb.gdb
@@ -0,0 +1,14 @@
+set debuginfod enabled on
+b imcopy2
+# primary HDU only:
+#run imcopy /srv/ska/surveys/ASK-WALLABY/cutout-574594-imagecube-42178.fits 0 "[1:10 1:20 1:1 1:200]" /tmp > cut.fits
+# has extensions:
+run imcopy /srv/ska/surveys/MKT-MGCLS/Abell_194_IPoln.fits 0 "[1:5000 1:5000 1:15 1:1]" /tmp > cut.fits
+n
+b fits_open_file
+b fits_select_image_section
+
+#s
+#b cfileio.c:1210
+
+
diff --git a/data-access/engine/src/vlkb/vlkb.spec b/data-access/engine/src/vlkb/vlkb.spec
new file mode 100644
index 0000000000000000000000000000000000000000..9cb6dd1c10536fdfaaffede94b33c3d72073b945
--- /dev/null
+++ b/data-access/engine/src/vlkb/vlkb.spec
@@ -0,0 +1,34 @@
+Name: vlkb
+Version: %{version}
+Release: 1%{?dist}
+Summary: vlkb
+Source1: vlkb
+License: GPLv3+
+URL: http://ia2.inaf.it
+BuildRequires: gcc >= 3.2.0, glibc-devel >= 2.17, libstdc++-devel >= 4.8, ast-devel >= 7.3.4, cfitsio-devel >= 3.370, libcsv-devel >= 3.0
+Requires: glibc >= 2.17, libstdc++ >= 4.8, ast >= 7.3.4, cfitsio >= 3.370, libcsv >= 3.0 
+
+%description
+This utility is part of the VLKB suite (ViaLactea Knowledge Base) to manipulate or calculate information about
+coordinate systems in a FITS-file. The set of available commands is printed in help.
+
+
+%prep
+
+%build
+
+
+%install
+mkdir -p %{buildroot}%{_prefix}/bin
+install -m 755 %{SOURCE1} %{buildroot}%{_prefix}/bin
+%files
+%{_bindir}/vlkb
+
+
+%post
+
+%postun
+
+
+%changelog
+                
diff --git a/data-access/engine/src/vlkbd/Makefile b/data-access/engine/src/vlkbd/Makefile
new file mode 100644
index 0000000000000000000000000000000000000000..ca69708c62a074f6f34abcd0acbb34ac515a006f
--- /dev/null
+++ b/data-access/engine/src/vlkbd/Makefile
@@ -0,0 +1,83 @@
+#================================================================================
+EXEC_NAME = vlkbd
+VERSION ?= $(shell git describe)
+BUILD_ ?= $(shell LANG=en_US date; hostname)
+#================================================================================
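+# usage: make [debug|release] [VERSION=x.y.z]   (default target is debug)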
+DEPS_DIR := ../common ../../ext/aria-csv ../../ext/nlohmann-json
+DEPS_INC := $(foreach d, $(DEPS_DIR), $d/include)
+DEPS_LIB := $(foreach d, $(DEPS_DIR), $d/lib)
+#================================================================================
+COMMON_DIR=../common
+COMMON_LIB = $(COMMON_DIR)/lib/libvlkbcommon.a
+#================================================================================
+INC_DIR=src $(DEPS_INC) $(COMMON_DIR)/include  ../../ext \
+	/usr/include/cfitsio \
+	/usr/include/postgresql
+LIB_DIR= $(DEPS_LIB) $(COMMON_DIR)/lib /usr/lib64/ast /usr/local/lib
+#================================================================================
+CC=g++
+CXX_DEBUG_FLAGS=-g -DFDB_DEBUG
+CXX_RELEASE_FLAGS=-O2
+CXX_DEFAULT_FLAGS=-c -x c++ -std=c++11 -fPIC -Wall -Wextra -Wconversion -fno-common -DVERSIONSTR='"$(VERSION)"' -DBUILD='"$(BUILD_)"'
+# FIXME: -last_pal missing in some builds (not really needed, only for the linker)
+LDFLAGS = -Wall -lvlkbcommon -lpq -lpqxx -lcfitsio -lrabbitmq -last -last_grf_2.0 -last_grf_3.2 -last_grf_5.6 -last_grf3d -last_err -pthread -lstdc++ -lm
+INC_PARM=$(foreach d, $(INC_DIR), -I$d)
+LIB_PARM=$(foreach d, $(LIB_DIR), -L$d)
+#================================================================================
+SRC_DIR=src
+OBJ_DIR=obj
+BIN_DIR=bin
+#================================================================================
+EXECUTABLE	:= $(BIN_DIR)/$(EXEC_NAME)
+CPP_FILES 	:= $(wildcard $(SRC_DIR)/*.cpp)
+OBJ_FILES 	:= $(addprefix $(OBJ_DIR)/,$(notdir $(CPP_FILES:.cpp=.o)))
+#================================================================================
+NPROCS = $(shell grep -c 'processor' /proc/cpuinfo)
+MAKEFLAGS += -j$(NPROCS)
+#================================================================================
+
+.PHONY: all
+all : debug
+
+.PHONY: release
+release: CXXFLAGS+=$(CXX_RELEASE_FLAGS) $(CXX_DEFAULT_FLAGS)
+release: $(EXECUTABLE)
+
+.PHONY: debug
+debug: CXXFLAGS+=$(CXX_DEBUG_FLAGS) $(CXX_DEFAULT_FLAGS)
+debug: $(EXECUTABLE)
+
+$(EXECUTABLE) : $(COMMON_LIB) makedir $(OBJ_FILES)
+	$(CC) $(OBJ_FILES) $(LIB_PARM) $(LDFLAGS) -o $@
+
+$(OBJ_DIR)/%.o: $(SRC_DIR)/%.cpp
+	        $(CC) $(CXXFLAGS) $(INC_PARM) -o $@ $<
+
+.PHONY: $(COMMON_LIB)
+$(COMMON_LIB) :
+	   $(MAKE) -C $(COMMON_DIR)
+
+.PHONY: makedir
+makedir:
+	-mkdir -p $(OBJ_DIR) $(BIN_DIR)
+
+
+.PHONY: clean
+clean :
+	-rm -fr $(OBJ_DIR) $(BIN_DIR)
+
+
+
+.PHONY: test
+test :
+	@tabs 20
+	@echo -e "EXEC_NAME:\t"  $(EXEC_NAME)
+	@echo -e "VERSION:\t"  $(VERSION)
+	@echo -e "CPP_FILES:\t"  $(CPP_FILES)
+	@echo -e "OBJ_FILES:\t"  $(OBJ_FILES)
+	@echo -e "C_FILES:\t"  $(C_FILES)
+	@echo -e "C_OBJ_FILES:\t"  $(C_OBJ_FILES)
+	@echo -e "INC_PARM:\t"  $(INC_PARM)
+	@echo -e "LIB_PARM:\t"  $(LIB_PARM)
+
+
diff --git a/data-access/engine/src/vlkbd/src/config.cpp b/data-access/engine/src/vlkbd/src/config.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..7f6b41778a69436ae1e2c6e71a18b1407c960435
--- /dev/null
+++ b/data-access/engine/src/vlkbd/src/config.cpp
@@ -0,0 +1,76 @@
+
+#include "io.hpp"
+#include "config.hpp"
+
+#include <iostream>
+#include <fstream>
+#include <sstream>
+#include <map>
+
+/*/ C
+#include <stdio.h>
+#include <stdlib.h> // atoi needed
+#include <string.h>
+*/
+
+using namespace std;
+
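+/* settings file format: one key=value pair per line, e.g.:
+ *   fits_path_surveys=/srv/vlkb/surveys
+ *   fits_path_cutouts=/srv/vlkb/cutouts
+ *   log_dir=/tmp
+ *   log_filename=vlkbd.log
+ * unknown keys are ignored; missing keys keep their defaults */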
+void config::read_config(const std::string & settings_path)
+{
+   std::ifstream settings_file(settings_path);
+   std::string line;
+
+   LOG_STREAM << "config::read_config()" << endl;
+
+   if (settings_file.fail())
+   {
+      LOG_STREAM << "config file does not exist. Default options used." << endl;
+
+      return;
+   }
+
+
+   while (std::getline(settings_file, line))
+   {
+      std::istringstream iss(line);
+      std::string id, eq, val;
+
+      if (std::getline(iss, id, '='))
+      {
+         if (std::getline(iss, val))
+         {
+            if (m_settings.find(id) != m_settings.end())
+            {
+               if (val.empty())
+               {
+                  LOG_STREAM << "config " << id.c_str()
+                     << " is empty. Keeping default " << m_settings[id].c_str() << endl;
+               }
+               else
+               {
+                  m_settings[id] = val;
+                  LOG_STREAM << "config " << id.c_str()
+                     <<" read as " << m_settings[id].c_str() << endl;
+               }
+            }
+            else
+            {
+               //Not present in map
+               LOG_STREAM << "Setting "<< id.c_str() << " not defined, ignoring it" << endl;
+               continue;
+            }
+         }
+         else
+         {
+            // no value after '=': comment or malformed line, skipping it
+            continue;
+         }
+      }
+      else
+      {
+         //Empty line, skipping it
+         continue;
+      }
+   }
+}
+
diff --git a/data-access/engine/src/vlkbd/src/config.hpp b/data-access/engine/src/vlkbd/src/config.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..c6f9e98c72a938ba647912e2d369a53bb44cf082
--- /dev/null
+++ b/data-access/engine/src/vlkbd/src/config.hpp
@@ -0,0 +1,114 @@
+
+#ifndef CONFIG_HPP
+#define CONFIG_HPP
+
+#include <string>
+#include <map>
+
+
+class config
+{
+   public:
+
+      void read_config(const std::string & settings_path);
+
+      std::string getLogDir() const      {return m_settings.at(log_dir);}
+      std::string getLogFileName() const {return m_settings.at(log_filename);}
+
+//     std::string getAuthority() const          {return m_settings.at(ivoid_authority);}
+//      std::string getResourceKey() const        {return m_settings.at(ivoid_resource_key);}
+//      std::string getObsCorePublisher() const   {return std::string{"ivo://"} + getAuthority() + std::string{"/"} + getResourceKey();}
+//      std::string getObsCoreAccessFormat() const {return m_settings.at(obscore_access_format);}
+      
+//      std::string getRemoteFitsDir() const {return m_settings.at(fits_url);}
+      std::string getFitsDir() const       {return m_settings.at(fits_dir);}
+      std::string getFitsCutDir() const    {return m_settings.at(fits_cutdir);}
+/*
+      std::string getDbms() const       {return m_settings.at(db_dbms);}
+      std::string getDbHostName() const {return m_settings.at(db_host_name);}
+      std::string getDbPort() const     {return m_settings.at(db_port);}
+      std::string getDbSchema() const   {return m_settings.at(db_schema);}
+      std::string getDbName() const     {return m_settings.at(db_name);}
+      std::string getDbUserName() const {return m_settings.at(db_user_name);}
+      std::string getDbPassword() const {return m_settings.at(db_password);}
+
+      std::string getDbUri(bool with_password = false) const
+      {return
+	      m_settings.at(db_dbms)
+		      + "://"
+		      + m_settings.at(db_user_name)
+		      + (with_password ? ":"+m_settings.at(db_password)  : "")
+		      + "@"
+		      + m_settings.at(db_host_name)
+		      + ":"
+		      + m_settings.at(db_port)
+		      + "/"
+		      + m_settings.at(db_name);
+      }
+
+      std::string getDbPostgresConnectString(bool with_password = false) const
+      {return
+	      "dbname = "  + m_settings.at(db_name)
+		      + " port = " + m_settings.at(db_port)
+		      + " host = " + m_settings.at(db_host_name)
+		      + " user = " + m_settings.at(db_user_name)
+		      + (with_password ? " password = " + m_settings.at(db_password)  : "")
+		      + " options=\'-c search_path=" + m_settings.at(db_schema) + "\'";
+      }
+*/
+   private:
+      std::string value(std::string key) {return m_settings.at(key);}
+
+      const std::string fits_dir{"fits_path_surveys"};
+      const std::string fits_cutdir{"fits_path_cutouts"};
+
+      const std::string log_dir{"log_dir"};
+      const std::string log_filename{"log_filename"};
+
+//      const std::string fits_url{"fits_url_surveys"};
+/*      const std::string ivoid_authority{"ivoid_authority"};
+      const std::string ivoid_resource_key{"ivoid_resource_key"};
+      const std::string obscore_access_format{"obscore_access_format"};
+
+      const std::string db_dbms{"db_dbms"};
+      const std::string db_host_name{"db_host_name"};
+      const std::string db_port{"db_port"};
+      const std::string db_user_name{"db_user_name"};
+      const std::string db_password{"db_password"};
+      const std::string db_name{"db_name"};
+      const std::string db_schema{"db_schema"};
+*/
+      //-------------------------------------------------
+      // defaults
+      //-------------------------------------------------
+
+      const std::string empty_string;
+
+      std::map<const std::string, std::string> m_settings 
+      {
+         {fits_dir, "/srv/surveys"},
+         {fits_cutdir, "/srv/cutouts"},
+
+         {log_dir, "/tmp"},
+         {log_filename, "vlkbd.log"},
+/*
+            {ivoid_authority, 	empty_string},
+            {ivoid_resource_key, 	empty_string},
+            {obscore_access_format, 	"application/fits"},
+
+            {fits_url, 	empty_string},
+
+            {db_dbms, 	empty_string},
+            {db_host_name,	empty_string},
+            {db_port, 	empty_string},
+            {db_user_name, 	empty_string},
+            {db_password, 	empty_string},
+            {db_name, 	empty_string},
+            {db_schema, 	empty_string}
+*/      
+      };
+};
+
+
+#endif
+
diff --git a/data-access/engine/src/vlkbd/src/json_reply.cpp b/data-access/engine/src/vlkbd/src/json_reply.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..8ceb02e0faa822c4cedf6be8eb9d3a18e4e94690
--- /dev/null
+++ b/data-access/engine/src/vlkbd/src/json_reply.cpp
@@ -0,0 +1,17 @@
+
+#include "json_reply.hpp"
+#include "json.hpp"
+#include "io.hpp"
+
+#include <vector>
+#include <string>
+#include <stdexcept>
+
+#include <stdlib.h>
+#include <string.h>
+#include <sched.h> // sched_getcpu()
+
+using namespace std;
+
+const string ENGINE_VERSION{"engine version " + string(VERSIONSTR) + " " + string(BUILD) + " on CPU#"+to_string(sched_getcpu())};
+
diff --git a/data-access/engine/src/vlkbd/src/json_reply.hpp b/data-access/engine/src/vlkbd/src/json_reply.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..53c7248dd0e315471c5396cc0904776d89fda451
--- /dev/null
+++ b/data-access/engine/src/vlkbd/src/json_reply.hpp
@@ -0,0 +1,26 @@
+
+#ifndef JSON_REPLY_HPP
+#define JSON_REPLY_HPP
+#include "cutout.hpp"
+#include "cutout_nljson.hpp"
+#include "mcutout.hpp"
+#include "mcutout_nljson.hpp"
+#include "json.hpp"
+
+#include <vector>
+#include <string>
+
+/* All nlohmann-json exceptions derive from json::exception <- std::exception.
+ * So let them be caught as std::exception ('internal errors') in the rpc-call's infinite loop,
+ * assuming all API syntactic errors were caught in the servlet API parser */
+
+class json_reply
+{
+   public:
+      json_reply() {j = nlohmann::json::object();};
+      void put_cutout_result(cutout_res_s res) { j = res; };
+      void put_mcutout_result(mcutout_res_s res) { j = res; };
+      std::string json_str() { return j.dump();};
+
+   private:
+      nlohmann::json j;
+};
+
+#endif
diff --git a/data-access/engine/src/vlkbd/src/json_service_call.cpp b/data-access/engine/src/vlkbd/src/json_service_call.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..647ea9eba1bd83f58ff16cd04c794817fa721662
--- /dev/null
+++ b/data-access/engine/src/vlkbd/src/json_service_call.cpp
@@ -0,0 +1,159 @@
+
+// convert parameters to/from JSON and call vlkb services
+
+// NOTE merge_id
+// jobId exists if we run under UWS (MERGE1,2,3)
+// create merge-id = amqp-queuename + jobId
+// such an id is unique when MERGE runs in parallel and multiple deploys
+// of vlkb-datasets are present against the same amqp-broker
+
+#include "json_service_call.hpp"
+#include "config.hpp"
+#include "io.hpp"
+#include "cutout.hpp"
+#include "cutout_nljson.hpp"
+#include "mcutout.hpp"
+#include "mcutout_nljson.hpp"
+#include "json_request.hpp"
+#include "json_reply.hpp"
+
+#include "fitsfiles.hpp" // calc_nullvals
+
+#include <stdexcept>
+#include <vector>
+#include <string>
+
+
+using namespace std;
+
+
+string to_string(service_error serr)
+{
+   string str;
+   switch(serr)
+   {
+      case service_error::INVALID_PARAM: return "INVALID_PARAM"; break;
+      case service_error::SYSTEM_ERROR:  return "SYSTEM_ERROR"; break;
+   }
+
+   LOG_STREAM << string(__FILE__) << ":" << to_string(__LINE__) << " unrecognized value in service_error type" << endl;
+
+   return str;
+}
+
+
+
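+/* builds the JSON error reply sent back to the caller, e.g.:
+ *   {"exception":{"type":"INVALID_PARAM","msg":"..."}} */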
+string service_exception(service_error error, const string what)
+{
+   LOG_trace(__func__);
+
+   json jreply;
+
+   switch(error)
+   {
+      case service_error::INVALID_PARAM:
+         jreply["exception"] = { {"type","INVALID_PARAM"}, {"msg",what} };
+         break;
+      case service_error::SYSTEM_ERROR:
+         jreply["exception"] = { {"type","SYSTEM_ERROR"}, {"msg",what} };
+         break;
+   }
+
+   LOG_STREAM << to_string(error) + ": " + what << endl;
+
+   return jreply.dump();
+}
+
+
+
+string service_call(string request_json, string queuename, config conf)
+{
+   LOG_trace(__func__);
+
+   const string setts_fitsdir{conf.getFitsDir()};
+   const string setts_fitscutdir{conf.getFitsCutDir()};
+
+   LOG_STREAM << request_json << endl;
+
+   json_request req(request_json);
+
+   json_reply reply;
+
+   if(req.is_subimg())
+   {
+      cutout_res_s cutres = do_cutout_file(
+            req.img_pathname(), req.img_hdunum(),
+            req.get_pos(), req.get_band(), req.get_time(), req.get_pol(),
+            req.count_null_values(),
+            req.extra_cards(),
+            setts_fitsdir,
+            setts_fitscutdir);
+
+      reply.put_cutout_result(cutres);
+   }
+   else if(req.is_mcutout())
+   {
+      struct mcutout_res_s mres = mcutout(req.cut_params(), setts_fitsdir, setts_fitscutdir);
+
+      mres.tgz_filename = setts_fitscutdir + "/" + mres.tgz_filename;
+      mres.filesize     = fitsfiles::fileSize(mres.tgz_filename);
+      reply.put_mcutout_result(mres);
+   }
+   else if(req.is_mergefiles())
+   {
+      string mergedfile_pathname;
+
+      unsigned long fsize = xmergefiles(
+            req.files_to_merge(),
+            req.dimensionality(),
+            setts_fitscutdir, setts_fitscutdir,
+            mergedfile_pathname);
+
+      cutout_res_s cutres{ fsize, mergedfile_pathname, {-1.0, 0, 0} };
+      reply.put_cutout_result(cutres);
+   }
+   else if(req.is_mergefiles_common_header())
+   {
+      string merge_id(queuename + "_" + req.merge_id());
+
+      xmergefiles_common_header(
+            merge_id,
+            req.files_to_merge(),
+            req.dimensionality(), //FIXME convert to int: dimensionality
+            setts_fitscutdir, setts_fitscutdir);
+   }
+   else if(req.is_mergefiles_reproject())
+   {
+      string merge_id(queuename + "_" + req.merge_id());
+
+      xmergefiles_reproject(
+            merge_id,
+            req.fitsfilename(),
+            req.dimensionality(), //FIXME convert to int: dimensionality
+            setts_fitscutdir, setts_fitscutdir);
+   }
+   else if(req.is_mergefiles_add_reprojected())
+   {
+      string merge_id(queuename + "_" + req.merge_id());
+
+      string mergedfile_pathname;
+
+      unsigned long fsize = xmergefiles_add_reprojected(
+            merge_id,
+            req.dimensionality(),
+            setts_fitscutdir, setts_fitscutdir,
+            mergedfile_pathname);
+
+      cutout_res_s cutres{ fsize, mergedfile_pathname, {-1.0, 0, 0} };
+      reply.put_cutout_result(cutres);
+   }
+   else
+   {
+      throw std::runtime_error("unrecognized vlkb service");
+   }
+
+   return reply.json_str();
+}
+
+
+
diff --git a/data-access/engine/src/vlkbd/src/json_service_call.hpp b/data-access/engine/src/vlkbd/src/json_service_call.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..9e9996dfcc6f0888cadd61bd757876bcc46b3f7f
--- /dev/null
+++ b/data-access/engine/src/vlkbd/src/json_service_call.hpp
@@ -0,0 +1,21 @@
+
+// in service_call():
+// queuename serves as an identifier for merge working-dirs, to distinguish
+// multiple instances of VLKB running against the same broker: each instance
+// needs a separate queue for rpc
+// FIXME qname should come from config file
+
+
+#ifndef JSON_SERVICE_CALL_HPP
+#define JSON_SERVICE_CALL_HPP
+
+#include "config.hpp"
+#include <string>
+
+std::string service_call(std::string request_json, std::string queuename, config conf);
+enum class service_error {INVALID_PARAM, SYSTEM_ERROR};
+
+std::string service_exception(enum service_error error, std::string what);
+
+#endif
+
diff --git a/data-access/engine/src/vlkbd/src/rpc_amqp.cpp b/data-access/engine/src/vlkbd/src/rpc_amqp.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..09c02a3491945632565a4e3ec1a28d176199d961
--- /dev/null
+++ b/data-access/engine/src/vlkbd/src/rpc_amqp.cpp
@@ -0,0 +1,434 @@
+
+// RPC over AMQP
+
+#include "rpc_amqp.hpp"
+#include "config.hpp"
+
+#include "rpc_amqp_utils.hpp"
+#include "json_service_call.hpp"
+
+#include <stdexcept>
+#include <string>
+
+#include <stdio.h>
+#include <syslog.h>
+
+#include <amqp_tcp_socket.h>
+#include <amqp.h>
+#include <amqp_framing.h>
+
+#include "io.hpp"
+
+using namespace std;
+
+// error handling
+
+
+void throw_ex_on_amqp_error(amqp_rpc_reply_t rc, amqp_connection_state_t conn, amqp_channel_t channel, char const *context)
+{
+
+   std::string ct(context);
+
+   if(rc.reply_type != AMQP_RESPONSE_NORMAL)
+   {
+      amqp_rpc_reply_t rc_ch_close = amqp_channel_close(conn, channel, AMQP_REPLY_SUCCESS);
+      if(rc_ch_close.reply_type != AMQP_RESPONSE_NORMAL)
+         throw std::runtime_error("cannot close channel after unsuccessful " + ct);
+
+      amqp_rpc_reply_t rc_conn_close = amqp_connection_close(conn, AMQP_REPLY_SUCCESS);
+      if(rc_conn_close.reply_type != AMQP_RESPONSE_NORMAL)
+         throw std::runtime_error("cannot close connection after unsuccessful " + ct);
+
+      if(AMQP_STATUS_OK != amqp_destroy_connection(conn))
+         throw std::runtime_error("cannot end connection after unsuccessful " + ct);
+      else
+         throw std::runtime_error(ct + " failed");
+   }
+}
+
+
+
+void syslog_on_amqp_error(amqp_rpc_reply_t x, char const *context)
+{
+   switch (x.reply_type) {
+      case AMQP_RESPONSE_NORMAL:
+         return;
+
+      case AMQP_RESPONSE_NONE:
+         syslog(LOG_ERR, "%s: missing RPC reply type!\n", context);
+         break;
+
+      case AMQP_RESPONSE_LIBRARY_EXCEPTION:
+         syslog(LOG_ERR, "%s: %s\n", context, amqp_error_string2(x.library_error));
+         break;
+
+      case AMQP_RESPONSE_SERVER_EXCEPTION:
+         switch (x.reply.id) {
+            case AMQP_CONNECTION_CLOSE_METHOD:
+               {
+                  amqp_connection_close_t *m =
+                     (amqp_connection_close_t *)x.reply.decoded;
+                  syslog(LOG_ERR, "%s: server connection error %uh, message: %.*s\n",
+                        context, m->reply_code, (int)m->reply_text.len,
+                        (char *)m->reply_text.bytes);
+                  break;
+               }
+            case AMQP_CHANNEL_CLOSE_METHOD:
+               {
+                  amqp_channel_close_t *m = (amqp_channel_close_t *)x.reply.decoded;
+                  syslog(LOG_ERR, "%s: server channel error %uh, message: %.*s\n",
+                        context, m->reply_code, (int)m->reply_text.len,
+                        (char *)m->reply_text.bytes);
+                  break;
+               }
+            default:
+               syslog(LOG_ERR, "%s: unknown server error, method id 0x%08X\n",
+                     context, x.reply.id);
+               break;
+         }
+         break;
+   }
+}
+
+
+
+// AMQP RPC
+//
+// establish connection to RabbitMQ-broker on "conn" and channel=1
+// use this connection [conn,channel] to:
+// * create queue where Java-vlkb-client will put messages (queuename must match routingKey of Java-client config file)
+// * bind the queue to pre-defined exchange "amq.direct"
+// * ask the broker to start basic-consumer on that queue
+// WAIT: Consume message from "conn"
+// Create new reply-message with CorrelationId from the received message
+// * publish the reply-msg to reply-to queue
+// return to WAIT: ... loop forever
+
+amqp_connection_state_t login_to_broker(const string user_name, const string password,
+      const string hostname, int port)
+{
+   // allocate new conn and initialize
+   // NOTE: must destroy conn at exit
+
+   amqp_connection_state_t conn = amqp_new_connection();
+   if(conn == NULL)
+      throw std::runtime_error("cannot create new connection");
+
+
+   { // open new TCP-socket and store in conn
+
+      amqp_socket_t *socket = NULL;
+      socket = amqp_tcp_socket_new(conn);
+      if (socket == NULL)
+      {
+         if(AMQP_STATUS_OK != amqp_destroy_connection(conn))
+            throw std::runtime_error("cannot end connection after unsuccessful new TCP socket");
+         else
+            throw std::runtime_error("error creating TCP socket");
+      }
+      int status;
+      status = amqp_socket_open(socket, hostname.c_str(), port);
+      if (status != 0)
+      {
+         if(AMQP_STATUS_OK != amqp_destroy_connection(conn))
+            throw std::runtime_error("cannot end connection after unsuccessful socket open");
+         else
+            throw std::runtime_error("error opening TCP socket, status: " + std::to_string(status));
+      }
+   }
+
+
+   amqp_rpc_reply_t rc;
+   rc = amqp_login(conn, "/" /*vhost*/, 0 /*channel_max*/, 131072 /*frame_max*/, 0 /*heartbeat: disabled*/, AMQP_SASL_METHOD_PLAIN, user_name.c_str(), password.c_str());
+   if(rc.reply_type != AMQP_RESPONSE_NORMAL)
+   {
+      amqp_rpc_reply_t rc_close = amqp_connection_close(conn, AMQP_REPLY_SUCCESS);
+      if(rc_close.reply_type != AMQP_RESPONSE_NORMAL)
+         throw std::runtime_error("cannot close connection after unsuccessful amqp login");
+      else if(AMQP_STATUS_OK != amqp_destroy_connection(conn))
+         throw std::runtime_error("cannot end connection after unsuccessful amqp login");
+      else
+         throw std::runtime_error("amqp_login failed");
+   }
+
+   return conn;
+}
+
+
+
+// RPC-loop
+
+
+
+int channel_open(amqp_connection_state_t conn, amqp_channel_t channel)
+{
+   amqp_channel_open(conn, channel);
+   amqp_rpc_reply_t rep = amqp_get_rpc_reply(conn);
+
+   return (rep.reply_type != AMQP_RESPONSE_NORMAL);
+}
+
+
+
+void declare_nondurable_autodelete_queue(
+      amqp_connection_state_t conn, amqp_channel_t channel,
+      amqp_bytes_t queuename)
+{
+   amqp_queue_declare(conn, channel,
+         queuename,
+         0, // 'passive' guarantees that this client sees the queue which was created already
+         0, // 'durable' queue survives broker restarts
+         0, // 'exclusive' to current connection (queue deleted when conn closes)
+         1, // 'auto_delete' the queue when not used
+         amqp_empty_table); // amqp_table_t arguments specific for AMQP broker implementation (none in RabbitMQ)
+}
+
+
+
+// start a queue consumer (e.g. start delivering msgs from the queue to this client)
+// broker-implementation should support at least 16 consumers per queue
+void start_basic_consumer_noack(
+      amqp_connection_state_t conn, amqp_channel_t channel,
+      amqp_bytes_t queuename)
+{
+   amqp_basic_consume(conn, channel,
+         queuename,
+         amqp_empty_bytes, // consumer_tag amqp_bytes_t: consumer-identifier (if empty, server generates a tag)
+         0,  // no_local amqp_boolean_t: broker will not send msgs to connection which published them
+         1,  // no_ack amqp_boolean_t: broker does not expect acknowledgement for delivered msgs
+         0,  // exclusive  amqp_boolean_t : only this consumer can access the queue
+         amqp_empty_table); // arguments amqp_table_t: implementation specific args (not used in RabbitMQ)
+} 
+
+
+
+int consume_message_wait_forever(amqp_connection_state_t conn, amqp_envelope_t *envelope)
+{
+   // release memory associated with all channels
+   amqp_maybe_release_buffers(conn);
+
+   amqp_rpc_reply_t res = amqp_consume_message(conn,
+         envelope, // message in envelope
+         NULL,     // timeout (struct *)
+         0);       // flags (int) AMQP_UNUSED
+
+   if (AMQP_RESPONSE_NORMAL != res.reply_type)
+   {
+      syslog_on_amqp_error(res, "amqp_consume_message");
+   }
+
+   return (AMQP_RESPONSE_NORMAL != res.reply_type);
+}
+
+
+
+void basic_publish_on_queue_or_drop(amqp_connection_state_t conn, amqp_channel_t channel,
+      amqp_bytes_t queuename,
+      amqp_bytes_t correlation_id,
+      const char * msg_buff)
+{
+   amqp_basic_properties_t props;
+
+   props._flags =
+      AMQP_BASIC_CONTENT_TYPE_FLAG  |
+      AMQP_BASIC_DELIVERY_MODE_FLAG |
+      AMQP_BASIC_CORRELATION_ID_FLAG;
+   props.content_type   = amqp_cstring_bytes("application/json");// FIXME make sure encoding is UTF-8
+   //props.content_type   = amqp_cstring_bytes("text/plain");
+   props.delivery_mode  = 2;
+   // 1: non-persistent
+   // 2: persistent (survives a broker restart - msg held on disk)
+   props.correlation_id = correlation_id;
+
+   int rc = amqp_basic_publish(conn, channel,
+         amqp_empty_bytes, // exchange amqp_bytes_t: empty = default-exchange
+         queuename,        // routingKey := queuename  amqp_bytes_t
+         0, // mandatory amqp_boolean_t 0: drop the msg if it cannot be routed (1: return the msg)
+         0, // immediate amqp_boolean_t 0: queue the msg if it cannot be delivered immediately (1: return the msg)
+         &props, // amqp_basic_properties_t
+         amqp_cstring_bytes(msg_buff)); // body
+
+   if (rc < 0)
+   {
+      syslog(LOG_ERR, "%s: basic publish failed with %uh, message: %s\n",__func__, rc, amqp_error_string2(rc));
+   }
+}
+
+
+
+// run RPC-loop
+// even if error happens on consume-request or publish-response
+void rpc_loop_forever(
+      amqp_connection_state_t conn, amqp_channel_t channel,
+      const string queuename,
+      const string settings_pathname)
+{
+   if(channel_open(conn, channel))
+   {
+      amqp_rpc_reply_t rep_close = amqp_connection_close(conn, AMQP_REPLY_SUCCESS);
+      if(rep_close.reply_type != AMQP_RESPONSE_NORMAL)
+         throw std::runtime_error("cannot close connection after unsuccessful channel open");
+      else if(AMQP_STATUS_OK != amqp_destroy_connection(conn))
+         throw std::runtime_error("cannot end connection after unsuccessful channel open");
+      else
+         throw std::runtime_error("channel open failed");
+   }
+
+
+   declare_nondurable_autodelete_queue(conn, channel, amqp_cstring_bytes(queuename.c_str()));
+   throw_ex_on_amqp_error(amqp_get_rpc_reply(conn), conn, channel, "amqp queue declare");
+
+   start_basic_consumer_noack(conn, channel, amqp_cstring_bytes(queuename.c_str()));
+   throw_ex_on_amqp_error(amqp_get_rpc_reply(conn), conn, channel, "amqp basic consume");
+
+   syslog(LOG_INFO,"AMQP initialized. Run RPC loop.");
+
+   config conf;
+   conf.read_config(settings_pathname);
+   syslog(LOG_INFO, string("Will log to " + conf.getLogDir()).c_str());
+
+   for (;;)
+   {
+      amqp_envelope_t envelope;
+
+      if(consume_message_wait_forever(conn, &envelope))
+      {
+         continue;
+      }
+
+      string request_json((const char*)envelope.message.body.bytes, envelope.message.body.len);
+
+      // RPC call
+
+      LOG_open(conf.getLogDir(), conf.getLogFileName());
+
+      string reply_json;
+      try
+      {
+         reply_json = service_call(request_json, queuename, conf);
+      }
+      catch(const invalid_argument& ex)
+      {
+         reply_json = service_exception(service_error::INVALID_PARAM, ex.what());
+      }
+      catch(const exception& ex)
+      {
+         reply_json = service_exception(service_error::SYSTEM_ERROR, ex.what());
+      }
+
+      LOG_close();
+
+
+      basic_publish_on_queue_or_drop(conn, channel,
+            envelope.message.properties.reply_to,
+            envelope.message.properties.correlation_id,
+            reply_json.c_str());
+
+      amqp_destroy_envelope(&envelope);
+   }
+
+   // Function never returns. Terminate with signal.
+}
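+
+
+/* A minimal RPC client sketch (not part of vlkbd) showing the wire protocol
+   served by rpc_loop_forever: publish a JSON request to the service queue with
+   reply_to and correlation_id set, then wait for the JSON reply on a
+   server-named queue. Assumptions, matching defaults used elsewhere in this
+   repository: broker on localhost:5672, guest/guest credentials, service
+   queue "vlkbdevel"; error handling elided.
+
+#include <amqp.h>
+#include <amqp_tcp_socket.h>
+#include <string>
+
+std::string rpc_call_sketch(const std::string & request_json)
+{
+   amqp_connection_state_t c = amqp_new_connection();
+   amqp_socket_open(amqp_tcp_socket_new(c), "localhost", 5672);
+   amqp_login(c, "/", 0, 131072, 0, AMQP_SASL_METHOD_PLAIN, "guest", "guest");
+   amqp_channel_open(c, 1);
+
+   // server-named exclusive auto-delete queue to receive the reply
+   amqp_queue_declare_ok_t *r =
+      amqp_queue_declare(c, 1, amqp_empty_bytes, 0, 0, 1, 1, amqp_empty_table);
+   amqp_bytes_t reply_queue = amqp_bytes_malloc_dup(r->queue);
+   amqp_basic_consume(c, 1, reply_queue, amqp_empty_bytes, 0, 1, 0, amqp_empty_table);
+
+   amqp_basic_properties_t props;
+   props._flags = AMQP_BASIC_CONTENT_TYPE_FLAG
+      | AMQP_BASIC_CORRELATION_ID_FLAG | AMQP_BASIC_REPLY_TO_FLAG;
+   props.content_type   = amqp_cstring_bytes("application/json");
+   props.correlation_id = amqp_cstring_bytes("1"); // any unique id
+   props.reply_to       = reply_queue;
+
+   // default exchange, routing key := service queue name
+   amqp_basic_publish(c, 1, amqp_empty_bytes, amqp_cstring_bytes("vlkbdevel"),
+         0, 0, &props, amqp_cstring_bytes(request_json.c_str()));
+
+   amqp_envelope_t envelope;
+   amqp_maybe_release_buffers(c);
+   amqp_consume_message(c, &envelope, NULL, 0);
+   std::string reply_json((const char *) envelope.message.body.bytes,
+         envelope.message.body.len);
+   amqp_destroy_envelope(&envelope);
+   amqp_bytes_free(reply_queue);
+   return reply_json;
+}
+*/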
+
+
+
+void do_cleanup(amqp_connection_state_t conn, amqp_channel_t channel)
+{
+   die_on_amqp_error(amqp_channel_close(conn, channel, AMQP_REPLY_SUCCESS), "Closing channel");
+   die_on_amqp_error(amqp_connection_close(conn, AMQP_REPLY_SUCCESS), "Closing connection");
+   die_on_error(amqp_destroy_connection(conn), "Ending connection");
+
+   LOG_close();
+}
+
+
+
+// interfaces
+
+// global to make accessible from signal_handler FIXME
+amqp_connection_state_t conn;
+amqp_channel_t channel;
+
+
+void rpc_run(const string user_name, const string password,
+      const string hostname, int port,
+      const string rpc_queuename,
+      const string settings_pathname)
+{
+   conn = login_to_broker(user_name, password, hostname, port);
+
+   channel = 1; // only single AMQP-channel per connection needed, use channel no. 1
+   rpc_loop_forever(conn, channel, rpc_queuename, settings_pathname); // func never returns
+}
+
+
+void rpc_cleanup(void)
+{
+   do_cleanup(conn, channel);
+}
+
+
+
+
+
+
+
+
+
+
+///////////////////////////////////////////////////////////////////////////
+///////////////////////////////////////////////////////////////////////////
+///////////////////////////////////////////////////////////////////////////
+// NOTE:
+// this was in rpc_run_loop AFTER queue_declare and BEFORE basic_consume :
+#ifdef usedefaultexchange
+// bind queue to exchange
+
+amqp_queue_bind(conn, channel,
+      queuename,
+      amqp_cstring_bytes("amq.direct"), // exchange
+      queuename,                        // routingKey := queuename
+      amqp_empty_table);                // empty arguments
+throw_ex_on_amqp_error(amqp_get_rpc_reply(conn), "amqp queue bind");
+
+// better load balancing
+
+amqp_basic_qos_ok_t * qok = amqp_basic_qos(conn, channel,
+      0, // prefetch_size   uint32_t
+      1, // prefetch_count  uint16_t
+      0);// global    amqp_boolean_t :
+// =0 prefetch_count applies separately to each consumer
+// =1 prefetch_count applies to all consumers
+throw_ex_on_amqp_error(amqp_get_rpc_reply(conn), "amqp basic QoS");
+#endif
+// ask the broker to start a basic-consumer on queue "queuename"
+// serves all channels in connection (envelope.channel) -> always reply to the
+// queue whose name is in the reply-to field (no need to distinguish channels,
+// reply-to queues were created by that channel)
+
+// no_ack affects message consume from queue:
+// broker will remove the msg right after delivery, without waiting for confirmation from the connected peer;
+// improves performance at the expense of reliability
+
+
+
+/* util 
+void print_envelope_if(int condition, amqp_envelope_t * envelope)
+{
+   if(condition){
+      printf("Delivery %u, exchange %.*s routingkey %.*s\n",
+            (unsigned) envelope->delivery_tag,
+            (int) envelope->exchange.len, (char *) envelope->exchange.bytes,
+            (int) envelope->routing_key.len, (char *) envelope->routing_key.bytes);
+
+      if (envelope->message.properties._flags & AMQP_BASIC_CONTENT_TYPE_FLAG) {
+         printf("Content-type: %.*s\n",
+               (int) envelope->message.properties.content_type.len,
+               (char *) envelope->message.properties.content_type.bytes);
+      }
+      printf("----\n");
+      //        amqp_dump(envelope->message.body.bytes, envelope->message.body.len);
+   }
+}
+*/
+
diff --git a/data-access/engine/src/vlkbd/src/rpc_amqp.hpp b/data-access/engine/src/vlkbd/src/rpc_amqp.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..42c8d91026565f6e583b2c60714bc813ba40668b
--- /dev/null
+++ b/data-access/engine/src/vlkbd/src/rpc_amqp.hpp
@@ -0,0 +1,16 @@
+
+#ifndef RPC_AMQP_HPP
+#define RPC_AMQP_HPP
+
+#include <string>
+
+void rpc_run(
+      const std::string user_name, const std::string password,
+      const std::string hostname, int port,
+      const std::string rpc_queuename,
+      const std::string settings_pathname);
+
+void rpc_cleanup(void);
+
+#endif
+
diff --git a/data-access/engine/src/vlkbd/src/rpc_amqp_utils.cpp b/data-access/engine/src/vlkbd/src/rpc_amqp_utils.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..dfc4422d5b71362a31c19fecd4e1d9ab8850b1c4
--- /dev/null
+++ b/data-access/engine/src/vlkbd/src/rpc_amqp_utils.cpp
@@ -0,0 +1,249 @@
+/* vim:set ft=c ts=2 sw=2 sts=2 et cindent: */
+
+#include "rpc_amqp_utils.hpp"
+
+#include <string>
+#include <stdexcept>
+
+#include <stdarg.h>
+#include <stdlib.h>
+#include <stdio.h>
+#include <stdint.h>
+#include <string.h>
+#include <ctype.h>
+#include <syslog.h>
+
+#include <amqp.h>
+#include <amqp_framing.h>
+
+
+void die(const char *fmt, ...)
+{
+  const size_t BUFF_SIZE = 256;
+  char buff[BUFF_SIZE];
+  buff[BUFF_SIZE - 1] = 0;
+  va_list ap;
+  va_start(ap, fmt);
+  vsnprintf(buff, BUFF_SIZE-1, fmt, ap);
+  va_end(ap);
+  throw std::runtime_error(buff);
+}
+
+void die_on_error(int x, char const *context)
+{
+  if (x < 0) {
+    die("%s: %s\n", context, amqp_error_string2(x));
+  }
+}
+
+// If synchronous AMQP API methods fail (return NULL) use this.
+// Gets the last amqp_rpc_reply (per-connection-global amqp_rpc_reply_t).
+// Normal operation: AMQP_RESPONSE_NORMAL -> RPC completed successfully.
+// Errors: AMQP_RESPONSE_SERVER_EXCEPTION (conn closed, channel closed) or
+// AMQP_RESPONSE_LIBRARY_EXCEPTION (library error).
+void die_on_amqp_error(amqp_rpc_reply_t x, char const *context)
+{
+  switch (x.reply_type) {
+
+    case AMQP_RESPONSE_NORMAL:
+      return;
+
+
+    case AMQP_RESPONSE_NONE:
+      die("%s: missing RPC reply type!\n", context);
+      break;
+
+
+    case AMQP_RESPONSE_LIBRARY_EXCEPTION:
+      die("%s: %s\n", context, amqp_error_string2(x.library_error));
+      break;
+
+
+    case AMQP_RESPONSE_SERVER_EXCEPTION:
+      switch (x.reply.id) {
+        case AMQP_CONNECTION_CLOSE_METHOD: {
+          amqp_connection_close_t *m = (amqp_connection_close_t *) x.reply.decoded;
+          die("%s: server connection error %u, message: %.*s\n",
+              context,
+              m->reply_code,
+              (int) m->reply_text.len, (char *) m->reply_text.bytes);
+          break;
+        }
+        case AMQP_CHANNEL_CLOSE_METHOD: {
+          amqp_channel_close_t *m = (amqp_channel_close_t *) x.reply.decoded;
+          die("%s: server channel error %u, message: %.*s\n",
+              context,
+              m->reply_code,
+              (int) m->reply_text.len, (char *) m->reply_text.bytes);
+          break;
+        }
+        default:
+          die("%s: unknown server error, method id 0x%08X\n", context, x.reply.id);
+          break;
+      }
+      break;
+
+
+    default:
+      die("%s: unknown server error, reply_type 0x%08X\n", context, x.reply_type);
+      break;
+  }
+  // code never reaches here
+}
+
+
+void throw_ex_on_amqp_error(amqp_connection_state_t conn, amqp_rpc_reply_t x, char const *context)
+{
+  const size_t buff_size = 255;
+  char buff[buff_size+1];
+
+  buff[buff_size] = 0;
+
+
+  switch (x.reply_type) {
+
+    case AMQP_RESPONSE_NORMAL:
+      return;
+
+
+    case AMQP_RESPONSE_NONE:
+      snprintf(buff, buff_size,"%s: missing RPC reply type!\n", context);
+      break;
+
+
+    case AMQP_RESPONSE_LIBRARY_EXCEPTION:
+      snprintf(buff, buff_size,"%s: %s\n", context, amqp_error_string2(x.library_error));
+      break;
+
+
+    case AMQP_RESPONSE_SERVER_EXCEPTION:
+      switch (x.reply.id) {
+        case AMQP_CONNECTION_CLOSE_METHOD: {
+          amqp_connection_close_t *m = (amqp_connection_close_t *) x.reply.decoded;
+          snprintf(buff, buff_size, "%s: server connection error %u, message: %.*s\n",
+              context,
+              m->reply_code,
+              (int) m->reply_text.len, (char *) m->reply_text.bytes);
+          break;
+        }
+        case AMQP_CHANNEL_CLOSE_METHOD: {
+          amqp_channel_close_t *m = (amqp_channel_close_t *) x.reply.decoded;
+          snprintf(buff, buff_size, "%s: server channel error %u, message: %.*s\n",
+              context,
+              m->reply_code,
+              (int) m->reply_text.len, (char *) m->reply_text.bytes);
+          break;
+        }
+        default:
+          snprintf(buff, buff_size, "%s: unknown server error, method id 0x%08X\n", context, x.reply.id);
+          break;
+      }
+      break;
+
+
+    default:
+      snprintf(buff, buff_size,"%s: unknown server error, reply_type 0x%08X\n", context, x.reply_type);
+      break;
+  }
+
+  if (AMQP_RESPONSE_NORMAL != x.reply_type)
+  {
+    die_on_error(amqp_destroy_connection(conn), "Ending connection before throw runtime exception");
+    throw std::runtime_error(buff);
+  }
+}
+
+
+
+
+
+
+
+// output
+
+
+static void dump_row(long count, int numinrow, int *chs)
+{
+  int i;
+
+  printf("%08lX:", count - numinrow);
+
+  if (numinrow > 0) {
+    for (i = 0; i < numinrow; i++) {
+      if (i == 8) {
+        printf(" :");
+      }
+      printf(" %02X", chs[i]);
+    }
+    for (i = numinrow; i < 16; i++) {
+      if (i == 8) {
+        printf(" :");
+      }
+      printf("   ");
+    }
+    printf("  ");
+    for (i = 0; i < numinrow; i++) {
+      if (isprint(chs[i])) {
+        printf("%c", chs[i]);
+      } else {
+        printf(".");
+      }
+    }
+  }
+  printf("\n");
+}
+
+static int rows_eq(int *a, int *b)
+{
+  int i;
+
+  for (i=0; i<16; i++)
+    if (a[i] != b[i]) {
+      return 0;
+    }
+
+  return 1;
+}
+
+void amqp_dump(void const *buffer, size_t len)
+{
+  unsigned char *buf = (unsigned char *) buffer;
+  long count = 0;
+  int numinrow = 0;
+  int chs[16];
+  int oldchs[16] = {0};
+  int showed_dots = 0;
+  size_t i;
+
+  for (i = 0; i < len; i++) {
+    int ch = buf[i];
+
+    if (numinrow == 16) {
+      int j;
+
+      if (rows_eq(oldchs, chs)) {
+        if (!showed_dots) {
+          showed_dots = 1;
+          printf("          .. .. .. .. .. .. .. .. : .. .. .. .. .. .. .. ..\n");
+        }
+      } else {
+        showed_dots = 0;
+        dump_row(count, numinrow, chs);
+      }
+
+      for (j=0; j<16; j++) {
+        oldchs[j] = chs[j];
+      }
+
+      numinrow = 0;
+    }
+
+    count++;
+    chs[numinrow++] = ch;
+  }
+
+  dump_row(count, numinrow, chs);
+
+  if (numinrow != 0) {
+    printf("%08lX:\n", count);
+  }
+}
diff --git a/data-access/engine/src/vlkbd/src/rpc_amqp_utils.hpp b/data-access/engine/src/vlkbd/src/rpc_amqp_utils.hpp
new file mode 100644
index 0000000000000000000000000000000000000000..2ada690149c65731f78f659e8aa2d926adc840b7
--- /dev/null
+++ b/data-access/engine/src/vlkbd/src/rpc_amqp_utils.hpp
@@ -0,0 +1,20 @@
+/* vim:set ft=c ts=2 sw=2 sts=2 et cindent: */
+#ifndef RPC_AMQP_UTILS_HPP
+#define RPC_AMQP_UTILS_HPP
+
+#include <amqp.h>
+
+void die(const char *fmt, ...);
+extern void die_on_error(int x, char const *context);
+extern void die_on_amqp_error(amqp_rpc_reply_t x, char const *context);
+
+extern void amqp_dump(void const *buffer, size_t len);
+
+extern uint64_t now_microseconds(void);
+extern void microsleep(int usec);
+
+
+void throw_ex_on_amqp_error(amqp_connection_state_t conn, amqp_rpc_reply_t x, char const *context);
+
+
+#endif
diff --git a/data-access/engine/src/vlkbd/src/vlkbd.cpp b/data-access/engine/src/vlkbd/src/vlkbd.cpp
new file mode 100644
index 0000000000000000000000000000000000000000..b0d9fcad69cedc0bd4ecffc712e99ceab23c9a33
--- /dev/null
+++ b/data-access/engine/src/vlkbd/src/vlkbd.cpp
@@ -0,0 +1,169 @@
+// vlkbd
+// daemon for VLKB services 
+
+#include "rpc_amqp.hpp"
+
+#include <iostream>
+#include <stdexcept>
+
+#include <stdlib.h>
+#include <stdio.h>
+#include <string.h>
+#include <stdint.h>
+
+// daemon related START
+#include <sys/types.h>
+#include <sys/stat.h>
+#include <sys/signal.h>
+#include <fcntl.h>
+#include <errno.h>
+#include <unistd.h>
+#include <syslog.h>
+// daemon related END
+
+
+#define DAEMON_NAME "vlkbd"
+
+char dname[256];
+
+
+using namespace std;
+
+void signal_handler(int sig) {
+
+   switch(sig){
+      case SIGTERM:
+         syslog(LOG_INFO,"received SIGTERM signal. Exiting ...");
+         rpc_cleanup();
+         exit(EXIT_SUCCESS);
+         break;
+
+      default:
+         syslog(LOG_WARNING,"received unhandled signal (%d) %s",
+               sig,strsignal(sig));
+         break;
+
+   }
+}
+
+
+void daemonize(void) {
+
+   /* Our process ID and Session ID */
+   pid_t pid, sid;
+
+   /* Fork off the parent process */
+   pid = fork();
+   if (pid < 0) {
+      exit(EXIT_FAILURE);
+   }
+   /* If we got a good PID, then
+      we can exit the parent process. */
+   if (pid > 0) {
+      exit(EXIT_SUCCESS);
+   }
+
+   // insert signal handler here...
+   signal(SIGTERM, signal_handler);
+
+   /* Change the file mode mask */
+   umask(0);
+
+   // Open any logs here ...
+   setlogmask(LOG_UPTO(LOG_INFO));
+   snprintf(dname, sizeof(dname), "%s[%d]", DAEMON_NAME, getpid());
+   openlog(dname, LOG_CONS, LOG_USER);
+   syslog(LOG_INFO,"service started.");
+
+   /* Create a new SID for the child process */
+   sid = setsid();
+   if (sid < 0) {
+      /* Log the failure */
+      exit(EXIT_FAILURE);
+   }
+
+   /* Change the current working directory */
+   if ((chdir("/")) < 0) {
+      /* Log the failure */
+      exit(EXIT_FAILURE);
+   }
+
+   /* Keep the standard file descriptors 0, 1, 2 occupied
+      so that no socket can be assigned to them; this guards the socket
+      against 'forgotten' printf's */
+
+   freopen ("/dev/null", "r", stdin);
+   freopen ("/dev/null", "w", stdout);
+   freopen ("/dev/null", "w", stderr);
+
+   /* Daemon-specific initialization goes here */
+}
+
+
+
+std::string base_name(std::string path)
+{
+   return path.substr(path.find_last_of("//") + 1); 
+   // FIXME replace with basename
+}
+
+
+
+void usage(const string progname)
+{
+   cerr
+      << "Usage: " << progname << " host port queuename conf-pathname [-t]" << endl
+      << endl
+      << " -t : stay on terminal, don't run as daemon" << endl
+      << endl
+      << "Version: " << VERSIONSTR << " " << BUILD << endl;
+}
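+
+
+// example invocation (hypothetical host and queue; port and conf path as used
+// elsewhere in this repository):
+//
+//    vlkbd localhost 5672 vlkbdevel /etc/vlkb/datasets.conf -t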
+
+
+
+int main(int argc, char  *argv[])
+{
+   const std::string progname = base_name(argv[0]);
+
+   if ((argc != 5) && (argc != 6))
+   {
+      usage(progname);
+      return 1;
+   }
+
+   int daemon = (argc == 5);
+   if(daemon)
+      daemonize();
+
+   // now stdin/out/err are redirected
+
+
+   // start AMQP-consumer
+
+   string hostname(argv[1]);
+   int port = atoi(argv[2]);
+   string queuename(argv[3]);
+   string user_name("guest");
+   string password("guest");
+   string settings_pathname(argv[4]);
+   //string settings_pathname("/etc/vlkb/datasets.conf");
+   // FIXME uname passwd  put to conf file or args - cannot start without them
+
+   try
+   {
+      rpc_run(user_name, password, hostname, port, queuename, settings_pathname);
+   }
+   catch (std::runtime_error const& e)
+   {
+      if(daemon)
+         syslog(LOG_ERR, "%s", e.what());
+      else
+         std::cerr << "Runtime error: " << e.what() << std::endl;
+
+      return 2;
+   }
+
+   syslog(LOG_INFO,"service stoped. Exiting...");
+
+   return 0;
+}
diff --git a/data-access/engine/src/vlkbd/vlkbd.1 b/data-access/engine/src/vlkbd/vlkbd.1
new file mode 100644
index 0000000000000000000000000000000000000000..d4c3cf9e3da11c257baae61e03b3dd797f249fa1
--- /dev/null
+++ b/data-access/engine/src/vlkbd/vlkbd.1
@@ -0,0 +1,22 @@
+.\"                                      Hey, EMACS: -*- nroff -*-
+.\" (C) Copyright 2023 ...
+.\"
+.TH vlkbd 1 
+.SH NAME
+vlkbd \- vlkbd application
+.SH SYNOPSIS
+.B vlkbd 
+.SH DESCRIPTION
+The 
+.B vlkbd 
+is an engine that performs data access (cutout, multi-cutout and demosaicing) on FITS files.
+The list of available sub-commands is printed by --help.
+.SH SEE ALSO
+.BR vlkb (1),
+.BR vlkb-obscore (1).
+.SH AUTHORS
+The
+.B vlkbd 
+was written by 
+RBu <rbu@ia2.inaf.it>
+.PP
+This document was written by RBu <rbu@ia2.inaf.it> for Debian.
diff --git a/data-access/engine/src/vlkbd/vlkbd.changelog.Debian b/data-access/engine/src/vlkbd/vlkbd.changelog.Debian
new file mode 100644
index 0000000000000000000000000000000000000000..812a6754059b4c0dd522725f3fc145b3a0c77238
--- /dev/null
+++ b/data-access/engine/src/vlkbd/vlkbd.changelog.Debian
@@ -0,0 +1,13 @@
+vlkbd (1.4.8) stable; urgency=low
+
+  [ VLKB ]
+  * First release via deb and rpm packages.
+
+ -- INAF <RBu@ia2.inaf.com>  Thu,  23 Dec 2023 11:30:00 +0100 
+
+vlkbd (1.4.7) stable; urgency=low
+
+  [ INAF ]
+  * Adds support for SODA parameters (http://ivoa.net/documents).
+
+ -- INAF <RBu@ia2.inaf.org>  Wed,   4 Oct 2023 11:00:00 +0100
diff --git a/data-access/engine/src/vlkbd/vlkbd.control b/data-access/engine/src/vlkbd/vlkbd.control
new file mode 100644
index 0000000000000000000000000000000000000000..5b0ae0c617677d6b957ba8d71ea1139f1424fe23
--- /dev/null
+++ b/data-access/engine/src/vlkbd/vlkbd.control
@@ -0,0 +1,8 @@
+Package: vlkbd
+Version:
+Section: utils
+Priority: optional
+Architecture: all
+Maintainer: VLKB <RBu@ia2.vlkb.org>
+Description: vlkbd engine to perform data access (cutout, multi-cutout, demosaicing) on FITS files.
+ The list of commands is printed in help.
+
diff --git a/data-access/engine/src/vlkbd/vlkbd.copyright b/data-access/engine/src/vlkbd/vlkbd.copyright
new file mode 100644
index 0000000000000000000000000000000000000000..6d86f408d7212bc8d9bdb9833dd80346e39188ed
--- /dev/null
+++ b/data-access/engine/src/vlkbd/vlkbd.copyright
@@ -0,0 +1,14 @@
+vlkbd
+
+Copyright: 2023 INAF <ia2@inaf.com>
+
+2023-10-30
+
+The entire code base may be distributed under the terms of the GNU General
+Public License (GPL), which appears immediately below.  Alternatively, all
+of the source code, as well as any code derived from it, may instead be
+distributed under the GNU Lesser General Public License (LGPL), at the
+choice of the distributor. The complete text of the LGPL appears at the
+bottom of this file.
+
+See /usr/share/common-licenses/(GPL|LGPL)
diff --git a/data-access/engine/src/vlkbd/vlkbd.datasets.conf b/data-access/engine/src/vlkbd/vlkbd.datasets.conf
new file mode 100644
index 0000000000000000000000000000000000000000..bccc41819036738345cde389866cc381c672eb2f
--- /dev/null
+++ b/data-access/engine/src/vlkbd/vlkbd.datasets.conf
@@ -0,0 +1,10 @@
+
+# path to original files
+fits_path_surveys=/srv/surveys
+# path to generated cutouts
+fits_path_cutouts=/srv/cutouts
+
+# logging records last request only
+# log_dir=/tmp
+# log_filename=vlkbd.log
+
diff --git a/data-access/engine/src/vlkbd/vlkbd.spec b/data-access/engine/src/vlkbd/vlkbd.spec
new file mode 100644
index 0000000000000000000000000000000000000000..ff016c7ce151d3067e29e60306a70b82204d562d
--- /dev/null
+++ b/data-access/engine/src/vlkbd/vlkbd.spec
@@ -0,0 +1,34 @@
+Name: vlkbd
+Version: %{version}
+Release: 1%{?dist}
+Summary: vlkbd
+Source1: vlkbd
+License: GPLv3+
+URL: http://ia2.inaf.it
+BuildRequires: gcc >= 3.2.0, glibc-devel >= 2.17, libstdc++-devel >= 4.8, ast-devel >= 7.3.4, cfitsio-devel >= 3.370, libcsv-devel >= 3.0
+Requires: glibc >= 2.17, libstdc++ >= 4.8, ast >= 7.3.4, cfitsio >= 3.370, libcsv >= 3.0 
+
+%description
+This utility is part of the VLKB-suite (ViaLactea Knowledge Base) to manipulate or calculate information about
+coordinate systems in a FITS-file. The set of available commands is printed in help.
+
+
+%prep
+
+%build
+
+
+%install
+mkdir -p %{buildroot}%{_prefix}/bin
+install -m 755 %{SOURCE1} %{buildroot}%{_prefix}/bin
+%files
+%{_bindir}/vlkbd
+
+
+%post
+
+%postun
+
+
+%changelog
+
diff --git a/data-access/engine/vlkbd_exec.sh b/data-access/engine/vlkbd_exec.sh
new file mode 100755
index 0000000000000000000000000000000000000000..f9c87f1064afd81482f72f9034972bf5a09313ad
--- /dev/null
+++ b/data-access/engine/vlkbd_exec.sh
@@ -0,0 +1,44 @@
+
+# how to kill all vlkbd in Makefile:
+
+# killall -q vlkbd-$(VERNUM); test $$? -eq 1
+
+
+if [ "$#" -lt 1 ]; then
+    echo -e "Run vlkbd-<version> on all CPU-cores, connecting to RabbitMQ on <amqphost>.\nUsage:\n\t $0 <amqphost> <queue_name> <datasets_conf>\n"
+    exit 1
+fi
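+
+# example (hypothetical host and queue name; conf path as used by vlkbd):
+#   ./vlkbd_exec.sh localhost vlkbdevel /etc/vlkb/datasets.conf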
+
+ncores=$(grep '^processor' /proc/cpuinfo | sort -u | wc -l)
+
+AMQPHOST=$1
+
+
+for core in $(seq 0 $(expr $ncores - 1) )
+do
+
+   # FIXME /usr/local should be configurable from engine/Makefile -> INSTALL_DIR
+   # or vlkbd_exec.sh should be under resources and handled together with datasets.conf
+   taskset -c $core vlkbd $AMQPHOST 5672 $2 $3
+
+done
+
+ps ax | grep vlkbd
+
+
+
+# with bitmask:
+
+# usage: run on 2nd cpu core
+#taskset 0x02 ./vlkb_amqp localhost 5672 test
+# run on 1st cpu core
+#taskset 0x01 ./vlkb_amqp localhost 5672 test
+# Note that a bitmask uses "hexadecimal" notation.
+# "0x11" is "00010001" in binary format, which corresponds
+# to CPU cores 0 and 4.
+# CPU cores 0 and 1 are represented by CPU affinity "0x3".
+# taskset $1 ./vlkb_amqp localhost 5672 test
+
+
+# use default exchange
+#./amqp_listen localhost 5672 "" "test"
diff --git a/data-access/servlet/Makefile b/data-access/servlet/Makefile
new file mode 100644
index 0000000000000000000000000000000000000000..45df132bf152f3069dea285b90075d5a36d23fda
--- /dev/null
+++ b/data-access/servlet/Makefile
@@ -0,0 +1,99 @@
+#===============================================================================
+NAME       := vlkb-cutout
+VERSION    := $(shell git describe)
+WEBAPP_WAR := $(NAME)-$(VERSION).war
+CONTEXT_ROOT ?= vlkb/datasets
+#===============================================================================
+LIB_DIR     ?= ../../java-libs/lib
+CLASS_DIR   := target/classes
+INSTALL_DIR ?= target/webapp
+AUTH_DIR    := ../../auth
+#===============================================================================
+# all sources
+SRC_DIR = src/main/java/:src/main/java/common:src/main/java/datasets:src/main/java/datasets/json-rpc:src/main/java/common/vo:src/main/java/common/vo/soda:src/main/java/resolver:src/main/java/webapi:src/main/java/webapi/output:$(AUTH_DIR)/src/main/java
+VOSI     = src/main/java/vosi/VlkbServletFile.java
+#IA2CONVFILTER = $(AUTH_DIR)/src/main/java/IA2TokenConvFilter.java
+AUTHFILTERS  = $(wildcard $(AUTH_DIR)/src/main/java/*Filter.java) $(AUTH_DIR)/src/main/java/AuthPolicy.java
+FILTERS  = $(wildcard src/main/java/webapi/*Filter.java)
+FILTERS  += $(wildcard src/main/java/authz/*Filter.java)
+SERVLETS = $(wildcard src/main/java/webapi/Servlet*.java)
+#SERVLETS = $(wildcard src/main/java/webapi/*Servlet.java)
+#===============================================================================
+JFLAGS = -g
+CLASSPATH = $(LIB_DIR)/*
+#===============================================================================
+
+.PHONY: build
+build:
+	echo "class Version { static String asString = \"$(VERSION)\";}" > src/main/java/Version.java
+	javac $(JFLAGS) -cp :$(CLASSPATH) -sourcepath $(SRC_DIR) -d $(CLASS_DIR) $(SERVLETS) $(FILTERS) $(AUTHFILTERS) $(VOSI)
+
+.PHONY: clean
+clean : 
+	rm -fr src/main/java/Version.java target
+
+
+.PHONY: install
+install:
+	mkdir -p $(INSTALL_DIR)
+	cp -r src/main/webapp/* $(INSTALL_DIR)
+	cp -r $(CLASS_DIR) $(LIB_DIR) $(INSTALL_DIR)/WEB-INF/
+	cp ../../java-libs/jjwt-*0.11.2.jar $(INSTALL_DIR)/WEB-INF/lib/
+	cp src/main/resources/*.properties $(INSTALL_DIR)/WEB-INF/classes/
+	cp src/main/resources/*.conf $(INSTALL_DIR)/WEB-INF/classes/
+	cp $(AUTH_DIR)/resources/*.properties $(INSTALL_DIR)/WEB-INF/classes/
+
+.PHONY: uninstall
+uninstall:
+	rm -rf $(INSTALL_DIR)
+
+
+.PHONY: war
+war:
+	@jar -cf target/$(WEBAPP_WAR) -C $(INSTALL_DIR) index.html\
+			  $(INSTALL_DIR)/META-INF/*.xml \
+			  $(INSTALL_DIR)/WEB-INF/*.xml \
+			  $(INSTALL_DIR)/WEB-INF/classes/* \
+			  $(INSTALL_DIR)/WEB-INF/lib/*.jar
+
+.PHONY:
+create-war: clean build install war
+
+
+
+# vlkb-devel host local
+
+.PHONY: vlkb-devel
+vlkb-devel: stop uninstall clean build install config war start
+
+
+.PHONY: config
+config:
+	cp config/*.ini config/*.properties config/cutout.properties target/webapp/WEB-INF/classes
+	cp config/context.xml       target/webapp/META-INF
+	cp config/web.xml           target/webapp/WEB-INF
+
+
+
+.PHONY: start
+start:
+	curl -T target/$(WEBAPP_WAR) -u admin:IA2lbt09 'http://localhost:8080/manager/text/deploy?path=/$(CONTEXT_ROOT)&update=true'
+
+.PHONY: stop
+stop:
+	-@curl -u  admin:IA2lbt09 'http://localhost:8080/manager/text/undeploy?path=/$(CONTEXT_ROOT)'
+
+.PHONY: status
+status:
+	curl localhost:8080/manager/text/list -u admin:IA2lbt09
+
+.PHONY: reload
+reload:
+	curl -u  admin:IA2lbt09  'http://localhost:8080/manager/text/reload?path=/$(CONTEXT_ROOT)'
+
+
+
+
+
+
+
diff --git a/data-access/servlet/config/Makefile b/data-access/servlet/config/Makefile
new file mode 100644
index 0000000000000000000000000000000000000000..93856fc8b2fd5a498ac263c9f8b99c185b668221
--- /dev/null
+++ b/data-access/servlet/config/Makefile
@@ -0,0 +1,42 @@
+################################################################
+# args
+AMQP_QUEUE ?= vlkbdevel
+
+# localhost pasquale pasqule-devel
+DBMS ?= localhost
+
+# test prod
+#FITSDB ?= test
+
+CONTEXT_ROOT ?= vlkb/datasets
+
+# <empty> ia2token garrtoken basic
+AUTH ?=
+#NAUTHZ ?= anonymous
+
+################################################################
+
+all: authpolicy.properties cutout.properties context.xml web.xml
+
+web.xml:
+	cat web-xml/web.xml-begining web-xml/web.xml-$(AUTH)-filter web-xml/web.xml-authz-filter web-xml/web.xml-servlets web-xml/web.xml-ending > web.xml
+
+authpolicy.properties: dbms.conf-$(DBMS)
+	cp dbms.conf-$(DBMS) $@
+
+# DB needed for ResolverByObsCore (VLKB-only)
+cutout.properties: cutout.properties.in
+	cat dbms.conf-$(DBMS) cutout.properties.in > cutout.properties
+	sed -i 's/AMQP_QUEUE/$(AMQP_QUEUE)/' cutout.properties
+	#sed -i 's/FITSDB/$(FITSDB)/' cutout.properties
+	sed -i 's|CONTEXT_ROOT|$(CONTEXT_ROOT)|' cutout.properties
+
+context.xml: context.xml.in
+	cp context.xml.in context.xml
+	#sed -i 's/FITSDB/$(FITSDB)/' context.xml
+
+
+.PHONY:
+clean:
+	-rm authpolicy.properties context.xml web.xml cutout.properties
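+
+# example invocation (illustrative values; variables as defined above):
+#   make AUTH=ia2token DBMS=pasquale-devel AMQP_QUEUE=vlkbdevel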
+
diff --git a/data-access/servlet/config/auth.properties b/data-access/servlet/config/auth.properties
new file mode 100644
index 0000000000000000000000000000000000000000..c9c8aee27f0017b03a10a17896236eae4a93a018
--- /dev/null
+++ b/data-access/servlet/config/auth.properties
@@ -0,0 +1,10 @@
+rap_uri=https://sso.ia2.inaf.it/rap-ia2
+gms_uri=https://sso.ia2.inaf.it/gms
+client_id=vospace_ui_demo
+client_secret=VOSpaceDemo123
+
+groups_autoload=true
+store_state_on_login_endpoint=true
+scope=openid email profile read:rap
+
+allow_anonymous_access=true
diff --git a/data-access/servlet/config/context.xml.in b/data-access/servlet/config/context.xml.in
new file mode 100644
index 0000000000000000000000000000000000000000..8b89784e107aa35d79a5e8390d3b368ebfeb01db
--- /dev/null
+++ b/data-access/servlet/config/context.xml.in
@@ -0,0 +1,16 @@
+
+<!-- below flag 'true' allows request.getParts() to fucntion otherwise throws exception -->
+<Context allowCasualMultipartParsing="true">
+        <Resources allowLinking="true">
+                <PostResources readOnly="false"
+                        className="org.apache.catalina.webresources.DirResourceSet"
+                        base="/srv/vlkb/cutouts"
+                        webAppMount="/cutouts"/>
+                <PostResources readOnly="true"
+                        className="org.apache.catalina.webresources.DirResourceSet"
+                        base="/srv/vlkb/surveys"
+                        webAppMount="/surveys"/>
+        </Resources>
+
+</Context>
+
diff --git a/data-access/servlet/config/cutout.properties.in b/data-access/servlet/config/cutout.properties.in
new file mode 100644
index 0000000000000000000000000000000000000000..96e9782c9eac23d9fe8ebf261fc2f9a1a69f7c38
--- /dev/null
+++ b/data-access/servlet/config/cutout.properties.in
@@ -0,0 +1,38 @@
+
+# path to FITS-file collections and cutouts store
+fits_path_surveys=/srv/vlkb/surveys
+
+# interpretation of parameter values as in VLKB
+default_sky_system=GALACTIC
+default_spec_system=VELO_LSRK
+
+# MIME-type of the response (only one of [1][2][3])
+
+# [1]:
+default_response_format=application/x-vlkb+xml
+
+# [2]:
+# default_response_format=application/fits;createfile=yes
+# fits_path_cutouts=/srv/vlkb/cutouts
+# amqp_host_name=localhost
+# amqp_port=5672
+# amqp_routing_key=AMQP_QUEUE
+
+# [3]:
+# default_response_format=application/x-vlkb+xml
+# surveys_metadata_abs_pathname=/srv/vlkb/surveys/survey_populate.csv
+# fits_path_cutouts=/srv/vlkb/cutouts
+# fits_url_cutouts=http://localhost:8080/CONTEXT_ROOT/cutouts
+# amqp_host_name=localhost
+# amqp_port=5672
+# amqp_routing_key=AMQP_QUEUE
+
+
+
+
+# other
+
+# should the response include the duration of request execution: yes | no
+show_duration=yes
+
+# database for resolver by mapping: key->path/to/fitsfile
diff --git a/data-access/servlet/config/dbms.conf-localhost b/data-access/servlet/config/dbms.conf-localhost
new file mode 100644
index 0000000000000000000000000000000000000000..1c59ef6ea99316ff778ca7dda6cb2cb3493aa9b3
--- /dev/null
+++ b/data-access/servlet/config/dbms.conf-localhost
@@ -0,0 +1,6 @@
+db_uri=jdbc:postgresql://127.0.0.1:5432/vialactea
+db_schema=datasets
+db_user_name=vialactea
+db_password=ia2vlkb
+
+
diff --git a/data-access/servlet/config/dbms.conf-pasquale b/data-access/servlet/config/dbms.conf-pasquale
new file mode 100644
index 0000000000000000000000000000000000000000..20542f00a2ec1138ba6f2d498851d77f99a8e9c3
--- /dev/null
+++ b/data-access/servlet/config/dbms.conf-pasquale
@@ -0,0 +1,6 @@
+db_uri=jdbc:postgresql://pasquale.ia2.inaf.it:5432/vialactea
+db_schema=datasets
+db_user_name=vialactea
+db_password=ia2vlkb
+
+
diff --git a/data-access/servlet/config/dbms.conf-pasquale-devel b/data-access/servlet/config/dbms.conf-pasquale-devel
new file mode 100644
index 0000000000000000000000000000000000000000..1a16c5bd941617c74e3c98e9ef469133452bb2ef
--- /dev/null
+++ b/data-access/servlet/config/dbms.conf-pasquale-devel
@@ -0,0 +1,7 @@
+db_uri=jdbc:postgresql://pasquale.ia2.inaf.it:5432/vialacteadevel
+db_port=5432
+db_schema=datasetsdevel
+db_user_name=vialactea
+db_password=ia2vlkb
+
+
diff --git a/data-access/servlet/config/iamtoken.properties b/data-access/servlet/config/iamtoken.properties
new file mode 100644
index 0000000000000000000000000000000000000000..e0935bb1f2d6f832b04b22c9dac817eac6741e5d
--- /dev/null
+++ b/data-access/servlet/config/iamtoken.properties
@@ -0,0 +1,10 @@
+
+#jwks_url=https://iam-escape.cloud.cnaf.infn.it/jwk
+introspect=https://iam-escape.cloud.cnaf.infn.it/introspect
+client_name=02cc260f-9837-4907-b2cb-a1a2d764fb15
+client_password=AJMi3qrB6AHRp_6y55tEwU-IpJ8uZ6X4QXeQ3W4la6dc-BlkzAY1OQpAE9hb1W7-VfYl4208FUtjE2Cl3hUYLkQ
+
+resource_id=vlkb
+
+non_authn_username=anonymous
+
diff --git a/data-access/servlet/config/neatoken.properties b/data-access/servlet/config/neatoken.properties
new file mode 100644
index 0000000000000000000000000000000000000000..21793e2600441bc6122e1ce54387ad8525bbd297
--- /dev/null
+++ b/data-access/servlet/config/neatoken.properties
@@ -0,0 +1,7 @@
+
+jwks_url=https://sso.neanias.eu/auth/realms/neanias-production/protocol/openid-connect/certs
+
+resource_id=vlkb
+
+non_authn_username=anonymous
+
diff --git a/data-access/servlet/config/shiro.ini b/data-access/servlet/config/shiro.ini
new file mode 100644
index 0000000000000000000000000000000000000000..4324a25031ec4eab8cdd1b1aa288afd17f3300eb
--- /dev/null
+++ b/data-access/servlet/config/shiro.ini
@@ -0,0 +1,40 @@
+#
+# Copyright (c) 2013 Les Hazlewood and contributors
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+# INI configuration is very powerful and flexible, while still remaining succinct.
+# Please see http://shiro.apache.org/configuration.html and
+# http://shiro.apache.org/web.html for more.
+
+[main]
+shiro.loginUrl = /login.jsp
+cacheManager = org.apache.shiro.cache.MemoryConstrainedCacheManager
+securityManager.cacheManager = $cacheManager
+#securityManager.realm = $stormpathRealm
+
+[users]
+# syntax: user = password , roles
+vialactea = ia2vlkb, ROLE_ADMIN
+
+[roles]
+ROLE_ADMIN = *
+
+[urls]
+#/login.jsp = authc
+/logout = logout
+/** = authcBasic
+#/ivoa/resources/basic/** = authcBasic
+#/ivoa/resources/full/** = authc
+
diff --git a/data-access/servlet/config/web-xml/web.xml--filter b/data-access/servlet/config/web-xml/web.xml--filter
new file mode 100644
index 0000000000000000000000000000000000000000..84ce5ae3c3463d026d04c046efb727363623087c
--- /dev/null
+++ b/data-access/servlet/config/web-xml/web.xml--filter
@@ -0,0 +1,3 @@
+
+<!-- no authorization filter configured -->
+
diff --git a/data-access/servlet/config/web-xml/web.xml-authz-filter b/data-access/servlet/config/web-xml/web.xml-authz-filter
new file mode 100644
index 0000000000000000000000000000000000000000..344a84227bc347765b89f346b373a2494c9c4836
--- /dev/null
+++ b/data-access/servlet/config/web-xml/web.xml-authz-filter
@@ -0,0 +1,9 @@
+        <filter>
+                <filter-name>AuthZFilter</filter-name>
+                <filter-class>AuthZFilter</filter-class>
+        </filter>
+        <filter-mapping>
+                <filter-name>AuthZFilter</filter-name>
+                <url-pattern>/*</url-pattern>
+        </filter-mapping>
+
diff --git a/data-access/servlet/config/web-xml/web.xml-begining b/data-access/servlet/config/web-xml/web.xml-begining
new file mode 100644
index 0000000000000000000000000000000000000000..cb551fabbc25b697e0340cdee9d22cc51115c461
--- /dev/null
+++ b/data-access/servlet/config/web-xml/web.xml-begining
@@ -0,0 +1,12 @@
+<?xml version="1.0" encoding="UTF-8"?>
+
+<!--
+ Copyright 2004-2005 Sun Microsystems, Inc.  All rights reserved.
+ Use is subject to license terms.
+-->
+
+<web-app version="2.4" xmlns="http://java.sun.com/xml/ns/j2ee" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://java.sun.com/xml/ns/j2ee http://java.sun.com/xml/ns/j2ee/web-app_2_4.xsd">
+        <display-name>Via Lactea. Query FITS datacubes.</display-name>
+        <distributable/>
+
+
diff --git a/data-access/servlet/config/web-xml/web.xml-ending b/data-access/servlet/config/web-xml/web.xml-ending
new file mode 100644
index 0000000000000000000000000000000000000000..9d1c7ed2dc6d32a1171a8a7a4ff16eacb13312b2
--- /dev/null
+++ b/data-access/servlet/config/web-xml/web.xml-ending
@@ -0,0 +1 @@
+</web-app>
diff --git a/data-access/servlet/config/web-xml/web.xml-garrtoken-filter b/data-access/servlet/config/web-xml/web.xml-garrtoken-filter
new file mode 100644
index 0000000000000000000000000000000000000000..2d9d53f9b312d38900d05ed0d1bd65233a6af83e
--- /dev/null
+++ b/data-access/servlet/config/web-xml/web.xml-garrtoken-filter
@@ -0,0 +1,26 @@
+        <filter>
+                <filter-name>NeaTokenFilter</filter-name>
+                <filter-class>NeaTokenFilter</filter-class>
+        </filter>
+        <filter-mapping>
+                <filter-name>NeaTokenFilter</filter-name>
+                <url-pattern>/vlkb_cutout</url-pattern>
+        </filter-mapping>
+        <filter-mapping>
+                <filter-name>NeaTokenFilter</filter-name>
+                <url-pattern>/vlkb_mcutout</url-pattern>
+        </filter-mapping>
+        <filter-mapping>
+                <filter-name>NeaTokenFilter</filter-name>
+                <url-pattern>/vlkb_merge</url-pattern>
+        </filter-mapping>
+        <filter-mapping>
+                <filter-name>NeaTokenFilter</filter-name>
+                <url-pattern>/vlkb_soda</url-pattern>
+        </filter-mapping>
+        <filter-mapping>
+                <filter-name>NeaTokenFilter</filter-name>
+                <url-pattern>/soda</url-pattern>
+        </filter-mapping>
+
+
diff --git a/data-access/servlet/config/web-xml/web.xml-ia2token-filter b/data-access/servlet/config/web-xml/web.xml-ia2token-filter
new file mode 100644
index 0000000000000000000000000000000000000000..15b061ff0d497bd1c6a9e0b98df5cef9f1108030
--- /dev/null
+++ b/data-access/servlet/config/web-xml/web.xml-ia2token-filter
@@ -0,0 +1,21 @@
+
+       <filter>
+               <filter-name>IA2TokenFilter</filter-name>
+               <filter-class>it.inaf.ia2.aa.TokenFilter</filter-class>
+       </filter>
+
+       <filter-mapping>
+               <filter-name>IA2TokenFilter</filter-name>
+               <url-pattern>/*</url-pattern>
+       </filter-mapping>
+
+       <filter>
+               <filter-name>UserTypeConverter</filter-name>
+               <filter-class>IA2TokenConvFilter</filter-class>
+       </filter>
+
+       <filter-mapping>
+               <filter-name>UserTypeConverter</filter-name>
+               <url-pattern>/*</url-pattern>
+       </filter-mapping>
+
diff --git a/data-access/servlet/config/web-xml/web.xml-monitor-filter b/data-access/servlet/config/web-xml/web.xml-monitor-filter
new file mode 100644
index 0000000000000000000000000000000000000000..ecefb2e0b3014b08fa18997795b54f118814a852
--- /dev/null
+++ b/data-access/servlet/config/web-xml/web.xml-monitor-filter
@@ -0,0 +1,9 @@
+        <filter>
+                <filter-name>MonitorFilter</filter-name>
+                <filter-class>MonitorFilter</filter-class>
+        </filter>
+        <filter-mapping>
+                <filter-name>MonitorFilter</filter-name>
+                <url-pattern>/*</url-pattern>
+        </filter-mapping>
+
diff --git a/data-access/servlet/config/web-xml/web.xml-servlets b/data-access/servlet/config/web-xml/web.xml-servlets
new file mode 100644
index 0000000000000000000000000000000000000000..a1ffa8e54953edb6c8e791ea48fe777a3c507d6c
--- /dev/null
+++ b/data-access/servlet/config/web-xml/web.xml-servlets
@@ -0,0 +1,146 @@
+
+
+
+    <servlet>
+        <servlet-name>default</servlet-name>
+        <servlet-class>
+          org.apache.catalina.servlets.DefaultServlet
+        </servlet-class>
+        <init-param>
+            <param-name>debug</param-name>
+            <param-value>1</param-value>
+        </init-param>
+        <init-param>
+            <param-name>listings</param-name>
+            <param-value>true</param-value>
+        </init-param>
+        <load-on-startup>1</load-on-startup>
+    </servlet>
+    <servlet-mapping>
+        <servlet-name>default</servlet-name>
+        <url-pattern>/</url-pattern>
+    </servlet-mapping>
+
+
+
+
+
+
+
+
+        <servlet>
+                <servlet-name>vlkb_cutout</servlet-name>
+                <servlet-class>ServletCutout</servlet-class>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>vlkb_cutout</servlet-name>
+                <url-pattern>/vlkb_cutout</url-pattern>
+        </servlet-mapping>
+
+        <servlet>
+                <servlet-name>vlkb_mcutout</servlet-name>
+                <servlet-class>ServletMCutout</servlet-class>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>vlkb_mcutout</servlet-name>
+                <url-pattern>/vlkb_mcutout</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>vlkb_merge</servlet-name>
+                <servlet-class>ServletMerge</servlet-class>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>vlkb_merge</servlet-name>
+                <url-pattern>/vlkb_merge</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>vlkb_vosi_availability</servlet-name>
+                <servlet-class>VlkbServletFile</servlet-class>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>vlkb_vosi_availability</servlet-name>
+                <url-pattern>/availability</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>vlkb_vosi_capabilities</servlet-name>
+                <servlet-class>VlkbServletFile</servlet-class>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>vlkb_vosi_capabilities</servlet-name>
+                <url-pattern>/capabilities</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>vlkb_soda</servlet-name>
+                <servlet-class>ServletCutout</servlet-class>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>vlkb_soda</servlet-name>
+                <url-pattern>/soda</url-pattern>
+        </servlet-mapping>
+        <servlet-mapping>
+                <servlet-name>vlkb_soda</servlet-name>
+                <url-pattern>/vlkb_soda</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>uws_merge</servlet-name>
+                <servlet-class>UWSMerge</servlet-class>
+                <init-param>
+                        <param-name>name</param-name>
+                        <param-value>merge</param-value>
+                </init-param>
+                 <init-param>
+                        <param-name>rootDirectory</param-name>
+                        <param-value>/tmp</param-value>
+                </init-param>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>uws_merge</servlet-name>
+                <url-pattern>/uws_merge/*</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>uws_mcutout</servlet-name>
+                <servlet-class>UWSMCutout</servlet-class>
+                <init-param>
+                        <param-name>name</param-name>
+                        <param-value>mcutout</param-value>
+                </init-param>
+                 <init-param>
+                        <param-name>rootDirectory</param-name>
+                        <param-value>/tmp</param-value>
+                </init-param>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>uws_mcutout</servlet-name>
+                <url-pattern>/uws_mcutout/*</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>uws_soda</servlet-name>
+                <servlet-class>UWSSoda</servlet-class>
+                <init-param>
+                        <param-name>name</param-name>
+                        <param-value>soda_uws</param-value>
+                </init-param>
+                 <init-param>
+                        <param-name>rootDirectory</param-name>
+                        <param-value>/tmp</param-value>
+                </init-param>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>uws_soda</servlet-name>
+                <url-pattern>/soda_uws/*</url-pattern>
+        </servlet-mapping>
+
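+<!-- Example SODA sync request against the mappings above (hypothetical ID and
+     CIRCLE values; context root "vlkb/datasets" as set in the servlet Makefile):
+
+     curl 'http://localhost:8080/vlkb/datasets/soda?ID=...&CIRCLE=265.1+-34.6+0.1'
+-->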
diff --git a/data-access/servlet/src/main/java/common.tex b/data-access/servlet/src/main/java/common.tex
new file mode 100644
index 0000000000000000000000000000000000000000..9e54b7bbd21589dc01ed3703dfb494861cb3646d
--- /dev/null
+++ b/data-access/servlet/src/main/java/common.tex
@@ -0,0 +1,28 @@
+GOAL three modules:
+
+AuthZ -> Filter: AuthPolicy*
+
+PSearch -> Servlet: ServletPSearch (SIAv2 + VLKB-legacy API)
+
+DataAccess -> Servlet/UWS: cutout/SODA incl MCutout and Merge
+
+
+-----
+
+Settings.java (partly: search does not need dirs)
+
+AuthPolicy* : to separate (filter-)module as Psearch
+
+VlkbSql.java (only common part: VlkbSql::dbConn and VlkbSql::doQuery())
+
+Coord.java
+
+Subsurvey*.java --> Subsurvey::findSubsurvey()  and SubsurveyId is the triple [name,species,tran]
+Subsurvey(Id) :
+*in dacc used to retrieve cards from CSV-in-mem to update headers (vlkb-mode)
+*in search : essential param to filter searches (not SIA?, vlkb-mode) and (vlkb) results.xml sorted by subsurveys
+*Settings implements load CSV to memory currently
+
+----------------------------------------------------
+
+conclude TRULY COMMON: Coord Subsurvey* DBconn&doQuery()
diff --git a/data-access/servlet/src/main/java/common/Coord.java b/data-access/servlet/src/main/java/common/Coord.java
new file mode 100644
index 0000000000000000000000000000000000000000..bd82e90bfaa9b1bfb0596619b0b2a2e2dbed04b8
--- /dev/null
+++ b/data-access/servlet/src/main/java/common/Coord.java
@@ -0,0 +1,28 @@
+
+
+
+class Coord
+{
+
+   String skySystem;  // FIXME enum ICRS | GALACTIC
+   String shape = "CIRCLE";      // FIXME enum CIRCLE | RECT | POLYGON   FIXME replace RECT -> RANGE
+   String specSystem; // FIXME enum VELO_LSRK | WAVE_Barycentric | NONE
+
+   Pos  pos;
+   Band band;
+   Time time;
+   Pol  pol;
+
+   Coord(String skySystem, Pos pos, String specSystem, Band band, Time time, Pol pol)
+   {
+      this.pos  = pos;
+      this.band = band;
+      this.time = time;
+      this.pol  = pol;
+
+      this.skySystem  = skySystem;
+      this.specSystem = specSystem;
+   }
+
+}
+
diff --git a/data-access/servlet/src/main/java/common/FitsCard.java b/data-access/servlet/src/main/java/common/FitsCard.java
new file mode 100644
index 0000000000000000000000000000000000000000..3987e99fa27bf883aca48f911be290bccb3d347f
--- /dev/null
+++ b/data-access/servlet/src/main/java/common/FitsCard.java
@@ -0,0 +1,50 @@
+
+
+class FitsCard
+{
+   public String key;
+   public String value;
+   public String comment;
+
+   public static FitsCard[] convertToFitsCard(double restFreq, String velUnitStr)
+   {
+      FitsCard fitsCard_CUNIT3  = new FitsCard();
+      FitsCard fitsCard_RESTFRQ = new FitsCard();
+
+      fitsCard_CUNIT3.key = "CUNIT3";
+      fitsCard_CUNIT3.value = convert_csv_vel_unit_to_fits_vel_unit(velUnitStr);
+      fitsCard_CUNIT3.comment = "key added by VLKB";
+
+      fitsCard_RESTFRQ.key = "RESTFRQ";
+      fitsCard_RESTFRQ.value = String.format("%.0f",restFreq);
+      fitsCard_RESTFRQ.comment = "key added by VLKB";
+
+      FitsCard[] keyRecords = new FitsCard[2];
+
+      keyRecords[0] = fitsCard_CUNIT3;
+      keyRecords[1] = fitsCard_RESTFRQ;
+
+      return keyRecords;
+
+   }
+
+
+   // FIXME should this go to parsing csv file ??
+   private static String convert_csv_vel_unit_to_fits_vel_unit(String csv_vel_unit)
+   {
+      if(csv_vel_unit.equals("m.s**-1"))
+      {
+         return new String("m/s");
+      }
+      else if(csv_vel_unit.equals("km.s**-1"))
+      {
+         return new String("km/s");
+      }
+      else
+      {
+         return csv_vel_unit;
+      }
+   }
+
+}
+
diff --git a/data-access/servlet/src/main/java/common/Subsurvey.java b/data-access/servlet/src/main/java/common/Subsurvey.java
new file mode 100644
index 0000000000000000000000000000000000000000..0ffe4d8ccfa11166a285823dc786d5f5b3a30d43
--- /dev/null
+++ b/data-access/servlet/src/main/java/common/Subsurvey.java
@@ -0,0 +1,132 @@
+
+/* NOTE originally was in search/output : designed for serializing search output xml */
+
+import java.util.logging.Logger;
+
+/* for Csv-loadSubsurveys */
+import java.io.IOException;
+import com.opencsv.*;
+import com.opencsv.exceptions.*;
+import java.io.FileReader;
+import java.io.FileNotFoundException;
+import java.util.Map;
+import java.util.List;
+import java.util.ArrayList;
+
+class Subsurvey
+{
+   private static final Logger LOGGER = Logger.getLogger("Subsurvey");
+
+   String description;
+   String surveyname;
+   String species;
+   String transition;
+   double rf; // rest frequency
+   String rf_unit;
+   String vel_unit;
+   String storage_path;
+
+
+   Subsurvey() {}
+   Subsurvey(Subsurvey ss)
+   {
+    this.description = ss.description;
+    this.surveyname = ss.surveyname;
+    this.species = ss.species;
+    this.transition = ss.transition;
+    this.rf = ss.rf; 
+    this.rf_unit = ss.rf_unit;
+    this.vel_unit = ss.vel_unit;
+    this.storage_path = ss.storage_path;
+   }
+
+   String id() { return (this.surveyname + " " + this.species + " "  + this.transition); }
+
+   boolean matches(String id) { return id.equals(this.id()); }
+
+
+
+
+   static public Subsurvey findSubsurvey(Subsurvey[] dbSubsurveys, String subsurvey_id)
+   {
+      for(Subsurvey curr : dbSubsurveys)
+      {
+         if(curr.matches(subsurvey_id))
+         {
+            return curr;
+         }
+      }
+
+      throw new AssertionError(subsurvey_id + " not found in surveys table");
+   }
+
+
+
+
+   static public FitsCard[] subsurveysFindCards(Subsurvey[] subsurveys, String subsurveyId)
+   {
+      if(subsurveys == null) return null;
+
+      Subsurvey subsurvey = Subsurvey.findSubsurvey(subsurveys, subsurveyId);
+
+      FitsCard[] keyRecords = FitsCard.convertToFitsCard(subsurvey.rf, subsurvey.vel_unit);
+
+      return keyRecords;
+   }
+
+
+
+
+   public static Subsurvey[] loadSubsurveys(String csvFilename)
+   {
+      LOGGER.info("loadSubsurvey from: " + csvFilename);
+
+      /* avoid accessing the file-system if csv-filename is not configured */
+      if( (csvFilename == null) || (csvFilename.length() < 1) )
+      {
+         LOGGER.warning("csvFilename is null, metadata not loaded");
+         return null;
+      }
+
+      try
+      {
+         CSVReaderHeaderAware csvReader = new CSVReaderHeaderAware(new FileReader(csvFilename));
+
+         List<Subsurvey> subsurveyList = new ArrayList<>();
+         Map<String, String> values;
+
+         while ((values = csvReader.readMap()) != null)
+         {
+            Subsurvey subsurvey = new Subsurvey();
+
+            subsurvey.description   = values.get("description");
+            subsurvey.surveyname    = values.get("name");
+            subsurvey.species       = values.get("species");
+            subsurvey.transition    = values.get("transition");
+            subsurvey.rf            = Double.parseDouble(values.get("rest_frequency"));
+            subsurvey.rf_unit       = values.get("restf_fits_unit");
+            subsurvey.vel_unit      = values.get("velocity_fits_unit");
+            subsurvey.storage_path  = values.get("storage_path");
+
+            subsurveyList.add(subsurvey);
+         }
+
+         return subsurveyList.toArray(new Subsurvey[0]);
+
+      }
+      catch(IOException ex) 
+      {
+         LOGGER.info("Error while loading " + csvFilename + " -> " + ex.getMessage());
+         return null;
+         //throw new IllegalStateException("Error while loading " + csvFilename + " file", ex);
+      }
+      catch(CsvValidationException ex) 
+      {
+         LOGGER.info("Error while reading " + csvFilename + " -> " + ex.getMessage());
+         return null;
+         //throw new IllegalStateException("Error while reading " + csvFilename + " file", ex);
+      }
+   }
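+
+   // A sketch of the expected CSV header (column names as read above; order illustrative):
+   //   description,name,species,transition,rest_frequency,restf_fits_unit,velocity_fits_unit,storage_path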
+
+}
+
diff --git a/data-access/servlet/src/main/java/common/vo/CutResult.java b/data-access/servlet/src/main/java/common/vo/CutResult.java
new file mode 100644
index 0000000000000000000000000000000000000000..7d1e0bb0e48fafc334b0a49068953103a7a00d5f
--- /dev/null
+++ b/data-access/servlet/src/main/java/common/vo/CutResult.java
@@ -0,0 +1,18 @@
+
+
+
+class CutResult
+{
+   String filename;
+   long filesize;
+   NullValueCount nullValueCount;
+
+
+   CutResult()
+   {
+      nullValueCount = new NullValueCount();
+   }
+
+}
+
+
diff --git a/data-access/servlet/src/main/java/common/vo/DataLink.java b/data-access/servlet/src/main/java/common/vo/DataLink.java
new file mode 100644
index 0000000000000000000000000000000000000000..321379dccc7009eaf01adb28ad0af77bd3ee9143
--- /dev/null
+++ b/data-access/servlet/src/main/java/common/vo/DataLink.java
@@ -0,0 +1,75 @@
+
+import java.util.logging.Logger;
+import java.util.List;
+import java.util.ArrayList;
+
+
+class DataLink
+{
+   private static final Logger LOGGER = Logger.getLogger("DataLink");
+
+   /* DataLink fields */
+
+   String id;
+   String accessUrl;
+   String serviceDef;
+   String errorMessage;
+   String description;
+   String semantics;
+   String contentType;
+   long  contentLength;
+
+   /* legacy-VLKB fields */
+
+   Inputs inputs;
+   String versionString;
+   String cut;
+   String absCutPathname;
+   int datacubeCount;
+   NullValueCount nullVals;
+   MCutResult[] mcutResultArr;
+
+
+   public DataLink()
+   {
+      this.nullVals = new NullValueCount();
+      this.versionString = Version.asString;
+      this.inputs = null;
+      this.datacubeCount = 1;
+   }
+
+   // FIXME fake, only to compile MCutout and Merge
+   public DataLink(CutResult cutResult)
+   {
+      this.nullVals = new NullValueCount();
+
+      this.id            = "_PIXEL_BOUNDS"; 
+      this.accessUrl     = cutResult.filename; // FIXME filename ->> remoteUrl
+      this.serviceDef    = null;
+      this.errorMessage  = null;
+      this.description   = "cutout_from ID";
+      this.semantics     = "FIXME find in IVOA docs...";
+      this.contentType   = "application/fits"; 
+      this.contentLength = cutResult.filesize;
+
+      // VLKB-extension to DataLink:
+      this.inputs         = null;
+      this.versionString  = Version.asString;
+      this.cut            = null;
+      this.absCutPathname = cutResult.filename;
+      this.datacubeCount  = 1;
+      this.nullVals       = cutResult.nullValueCount;
+      this.mcutResultArr  = null;
+   }
+
+   public String convertLocalPathnameToRemoteUrl(String localPathname, String FITScutpath, String FITSRemoteUrlCutouts)
+   {
+      /* literal replace (not replaceAll): the path may contain regex metacharacters */
+      String filename = localPathname.replace(FITScutpath + "/", "");
+      LOGGER.info("local filename: " + filename);
+      String remotefname = FITSRemoteUrlCutouts + "/" + filename;
+      LOGGER.info("remote url    : " + remotefname);
+      return remotefname;
+   }
+
+}
+
diff --git a/data-access/servlet/src/main/java/common/vo/MCutResult.java b/data-access/servlet/src/main/java/common/vo/MCutResult.java
new file mode 100644
index 0000000000000000000000000000000000000000..5288da237b716068d2af188b2175f5fd46e298af
--- /dev/null
+++ b/data-access/servlet/src/main/java/common/vo/MCutResult.java
@@ -0,0 +1,12 @@
+
+
+
+class MCutResult
+{
+   public enum ContentType {FILENAME, BAD_REQUEST, SERVICE_ERROR};
+   public Inputs inputs;
+   public int index;
+   public ContentType contentType;
+   public String content;
+}
+
diff --git a/data-access/servlet/src/main/java/common/vo/NullValueCount.java b/data-access/servlet/src/main/java/common/vo/NullValueCount.java
new file mode 100644
index 0000000000000000000000000000000000000000..d265b3d11bd1c0fcb78266165b6dc94ed373bc13
--- /dev/null
+++ b/data-access/servlet/src/main/java/common/vo/NullValueCount.java
@@ -0,0 +1,11 @@
+
+
+
+class NullValueCount
+{
+   double percent;
+   long nullCount;
+   long totalCount;
+}
+
+
diff --git a/data-access/servlet/src/main/java/common/vo/soda/Band.java b/data-access/servlet/src/main/java/common/vo/soda/Band.java
new file mode 100644
index 0000000000000000000000000000000000000000..89630259c0ea1d762defc66ca5019ea2b6568280
--- /dev/null
+++ b/data-access/servlet/src/main/java/common/vo/soda/Band.java
@@ -0,0 +1,35 @@
+
+/* REC-SODA BAND parameter defines the interval(s) to be extracted
+from the data using a floating point interval (xtype="interval") as defined
+in DALI. */
+
+
+class Band
+{
+   public enum System {WAVE_Barycentric, VELO_LSRK, NONE};
+
+   System system;
+   double[] wavelength;
+
+
+   public Band(String str)
+   {
+      this.wavelength = Parser.getDaliIntervalPositiveValues(str, "BAND");
+   }
+
+   public Band(double low, double up)
+   {
+      wavelength = new double[2];
+      wavelength[0] = low;
+      wavelength[1] = up;
+   }
+
+   public void setSystem(Band.System system) { this.system = system; }
+
+   public String toString()
+   {
+      return "BAND " + wavelength[0] + " " + wavelength[1];
+   }
+
+}
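+
+/* Usage sketch (illustrative values, not from the project): BAND arrives as a
+   DALI interval of two space-separated numbers; "-Inf" / "+Inf" mark open limits:
+
+      Band band = new Band("1.4E9 1.5E9");
+      band.setSystem(Band.System.VELO_LSRK);
+      // band.toString() -> "BAND 1.4E9 1.5E9"
+*/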
+
diff --git a/data-access/servlet/src/main/java/common/vo/soda/Circle.java b/data-access/servlet/src/main/java/common/vo/soda/Circle.java
new file mode 100644
index 0000000000000000000000000000000000000000..dd5b406c9a42b930ded7276d89f47a611be6e686
--- /dev/null
+++ b/data-access/servlet/src/main/java/common/vo/soda/Circle.java
@@ -0,0 +1,61 @@
+
+/* REC-SODA: UCD describing the CIRCLE parameter is pos.outline;obs.*/
+/* REC-DALI Sect 3.3.6 Circle: definition */
+
+
+class Circle
+{
+
+   double lon;
+   double lat;
+   double radius;
+
+   public Circle(String value)
+   {
+      parseCircle(value);
+   }
+
+   private void parseCircle(String str)
+   {
+      String[] arr = str.strip().split(" +");
+
+      /* String.split never returns null; a missing or space-less value
+         surfaces as a wrong element count */
+      final int len = 3;
+      if(arr.length != len)
+         throw new IllegalArgumentException(
+               "CIRCLE : must have " + len + " elements delimited by space, but found " + arr.length);
+
+      double dbl = Double.parseDouble(arr[0]);
+      if ((dbl < 0) || (dbl > 360))
+         throw new IllegalArgumentException("CIRCLE : first number must be in range [0,360] but found " + dbl);
+      this.lon = dbl;
+
+      dbl = Double.parseDouble(arr[1]);
+      if ((dbl < -90) || (dbl > 90))
+         throw new IllegalArgumentException("CIRCLE : second number must be in range [-90,90] but found " + dbl);
+      this.lat = dbl;
+
+      dbl = Double.parseDouble(arr[2]);
+      if ((dbl <= 0) || (dbl > 180))
+         throw new IllegalArgumentException("CIRCLE : third number must be in range (0,180] but found " + dbl);
+      this.radius = dbl;
+   }
+
+   public String toString()
+   {
+      return "CIRCLE " + Double.valueOf(lon) + " " + Double.valueOf(lat) + " " + Double.valueOf(radius);
+   }
+
+}
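+
+/* Usage sketch (illustrative values): center longitude, latitude and radius
+   in degrees, as in a SODA request CIRCLE=10.5 -0.3 0.25 :
+
+      Circle c = new Circle("10.5 -0.3 0.25");
+      // c.lon = 10.5, c.lat = -0.3, c.radius = 0.25
+*/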
+
diff --git a/data-access/servlet/src/main/java/common/vo/soda/MultiValuedParamNotSupported.java b/data-access/servlet/src/main/java/common/vo/soda/MultiValuedParamNotSupported.java
new file mode 100644
index 0000000000000000000000000000000000000000..2b3dd3c6a1efaf93eeea9eccb4b359e0200ab8c4
--- /dev/null
+++ b/data-access/servlet/src/main/java/common/vo/soda/MultiValuedParamNotSupported.java
@@ -0,0 +1,8 @@
+
+
+
+public class MultiValuedParamNotSupported  extends IllegalArgumentException {
+    public MultiValuedParamNotSupported(String errorMessage){//, Throwable err) {
+        super(errorMessage);//, err);
+    }
+}
diff --git a/data-access/servlet/src/main/java/common/vo/soda/Parser.java b/data-access/servlet/src/main/java/common/vo/soda/Parser.java
new file mode 100644
index 0000000000000000000000000000000000000000..66d0d67eac82d3a4928db6cf792b02fd0c708b6c
--- /dev/null
+++ b/data-access/servlet/src/main/java/common/vo/soda/Parser.java
@@ -0,0 +1,90 @@
+
+import java.util.Map;
+
+
+class Parser
+{
+
+   public static String getFirstString(Map<String, String[]> params, String key)
+   {
+      String[] values = params.get(key);
+      if (values == null) return null;
+
+      if (values.length < 1)
+         throw new IllegalArgumentException(key + " has no valid value");
+      else
+         return values[0]; // FIXME if values[0] is null -> cannot distinguish from key not found
+   }
+
+
+   public static String[] getFirstStringArray(Map<String, String[]> params, String key, String separator, int arrayLength)
+   {
+      String array = getFirstString(params, key);
+      if (array == null) return null;
+
+      String[] stringArray = array.split(separator);
+
+      if(stringArray.length != arrayLength)
+         throw new IllegalArgumentException(
+               key + " parameter has incorrect number of elements (" 
+               + stringArray.length + " vs " + arrayLength + ") or incorrect separator used");
+
+      return stringArray;
+   }
+
+
+   public static double[] getDaliIntervalPositiveValues(String value, String errorMsgPrefix)
+   {
+      String[] arr = value.strip().split(" +");
+
+      final int len = 2;
+      if(arr.length != len)
+         throw new IllegalArgumentException(
+               errorMsgPrefix + " : must have " + len + " space-delimited elements, but found " + arr.length);
+
+      double[] dblArr = new double[2];
+
+      /* DALI open intervals: "-Inf" as the lower limit, "+Inf" (or "Inf") as the upper limit */
+
+      String val = arr[0];
+      if(val.equals("-Inf"))
+      {
+         dblArr[0] = Double.NEGATIVE_INFINITY;
+      }
+      else
+      {
+         double dbl = Double.parseDouble(val);
+         if (dbl < 0)
+            throw new IllegalArgumentException(errorMsgPrefix + " : values must be positive, but first value was " + dbl);
+         dblArr[0] = dbl;
+      }
+
+      val = arr[1];
+      if(val.equals("Inf") || val.equals("+Inf"))
+      {
+         dblArr[1] = Double.POSITIVE_INFINITY;
+      }
+      else
+      {
+         double dbl = Double.parseDouble(val);
+         if (dbl < 0)
+            throw new IllegalArgumentException(errorMsgPrefix + " : values must be positive, but second value was " + dbl);
+         dblArr[1] = dbl;
+      }
+
+      return dblArr;
+   }
+
+
+}
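+
+/* Usage sketch (illustrative values) of the DALI interval forms parsed above:
+
+      Parser.getDaliIntervalPositiveValues("55197.0 55197.5", "TIME"); // {55197.0, 55197.5}
+      Parser.getDaliIntervalPositiveValues("-Inf 55197.5",    "TIME"); // {-Infinity, 55197.5} open lower limit
+      Parser.getDaliIntervalPositiveValues("55197.0 +Inf",    "TIME"); // {55197.0, Infinity}  open upper limit
+*/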
diff --git a/data-access/servlet/src/main/java/common/vo/soda/Pol.java b/data-access/servlet/src/main/java/common/vo/soda/Pol.java
new file mode 100644
index 0000000000000000000000000000000000000000..189fe7f0215accd06fad3f2d1a83d6f59532fb62
--- /dev/null
+++ b/data-access/servlet/src/main/java/common/vo/soda/Pol.java
@@ -0,0 +1,40 @@
+
+/* REC ObsCore B.6.5.1. List of polarization states (pol_states)
+... polarization labels inspired by the FITS specification. See Table 7 in FITS WCS Paper 1
+(Greisen & Calabretta 2002). Labels are combined using symbols from the
+
+{I Q U V RR LL RL LR XX YY XY YX POLI POLA} set
+
+and separated by a / character. A leading / character must start the list and
+a trailing / character must end it. It should be ordered following the above list, compatible with
+the FITS list table for polarization definition.
+*/
+
+// NOTE: POLI and POLA are not FITS STOKES values; the others match and are defined in FITS as I=1..V=4 and RR=-1..YX=-8
+
+import java.util.Arrays;
+import java.util.Set;
+import java.util.HashSet;
+
+class Pol
+{
+   final static Set<String> POL_STATES
+      = new HashSet<>(Arrays.asList("I", "Q", "U", "V", "RR", "LL", "RL", "LR", "XX", "YY", "XY", "YX"));
+
+   String[] states;
+
+   Pol(String[] values)
+   {
+      for(String pol : values)
+         if(!POL_STATES.contains(pol))
+            throw new IllegalArgumentException("POL value is " + pol +" but must be one of " + String.join(" ", POL_STATES));
+
+      this.states = values;
+   }
+
+
+   public String toString()
+   {
+      return "POL " + String.join(" ", states);
+   }
+}
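+
+/* Usage sketch (illustrative): POL may be given multiple times in a request,
+   so the states arrive as an array:
+
+      Pol pol = new Pol(new String[]{"I", "Q"});
+      // pol.toString() -> "POL I Q"
+*/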
diff --git a/data-access/servlet/src/main/java/common/vo/soda/Polygon.java b/data-access/servlet/src/main/java/common/vo/soda/Polygon.java
new file mode 100644
index 0000000000000000000000000000000000000000..745b58105acb01dcf213194c11a85688444e2990
--- /dev/null
+++ b/data-access/servlet/src/main/java/common/vo/soda/Polygon.java
@@ -0,0 +1,89 @@
+/*  REC DALI 3.3.7 Polygon
+In spherical coordinates, all longitude values must fall within [0,360] and
+all latitude values within [-90,90]. Vertices must be ordered such that the
+polygon winding direction is counter-clockwise (when viewed from the origin
+toward the sky) as described in (Rots, 2007).
+
+
+Rots, A. (2007), 'Space-time coordinate metadata for the virtual observa-
+tory', IVOA Recommendation.
+http://www.ivoa.net/documents/latest/STC.html :
+
+4.5.1.4 Polygon
+....
+  In order to avoid ambiguities in direction, vertices need to be less
+than 180 deg apart in both coordinates. Great circles or small circles spanning 180 deg
+require specification of an extra intermediate vertex.
+....
+The boundaries are considered part of the region. The inside of the region is
+defined as that part of coordinate space that is encircled by the polygon in a
+counter-clockwise sense.
+...
+*/
+
+class Polygon
+{
+
+   double[] lon;
+   double[] lat;
+
+   public Polygon(String value)
+   {
+      parsePolygon(value);
+   }
+
+   private void parsePolygon(String str)
+   {
+      String[] arr = str.strip().split(" +");
+
+      final int minLen = 3*2; // REC SODA : at least 3 (lon,lat) points
+      if(arr.length < minLen)
+         throw new IllegalArgumentException(
+               "POLYGON : must have at least " + minLen + " elements delimited by space, but found " + arr.length);
+
+      boolean isEven = ((arr.length % 2) == 0);
+      if(!isEven)
+         throw new IllegalArgumentException("POLYGON must have an even number of values, but has " + arr.length);
+
+      lon = new double[arr.length/2];
+      lat = new double[arr.length/2];
+
+      for(int ii=0; ii<(arr.length-1); ii+=2)
+      {
+         double dbl = Double.parseDouble(arr[ii]);
+         if ((dbl < 0) || (dbl > 360))
+            throw new IllegalArgumentException("POLYGON : longitude must be in range [0,360] but found " + dbl);
+         this.lon[ii/2] = dbl;
+
+         dbl = Double.parseDouble(arr[ii+1]);
+         if ((dbl < -90) || (dbl > 90))
+            throw new IllegalArgumentException("POLYGON : latitude must be in range [-90,90] but found " + dbl);
+         this.lat[ii/2] = dbl;
+      }
+   }
+
+
+   public String toString()
+   {
+      StringBuilder sb = new StringBuilder("POLYGON");
+      int ii = 0;
+      for(ii = 0; ii<lon.length; ii++)
+         sb.append(" (" + String.valueOf(lon[ii]) + ", " + String.valueOf(lat[ii]) + ")");
+
+      return sb.toString();
+   }
+
+}
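+
+/* Usage sketch (illustrative values): at least three (lon,lat) vertices in
+   degrees, counter-clockwise as required by DALI:
+
+      Polygon p = new Polygon("10.0 -1.0 11.0 -1.0 10.5 0.0");
+      // p.toString() -> "POLYGON (10.0, -1.0) (11.0, -1.0) (10.5, 0.0)"
+*/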
+
diff --git a/data-access/servlet/src/main/java/common/vo/soda/Pos.java b/data-access/servlet/src/main/java/common/vo/soda/Pos.java
new file mode 100644
index 0000000000000000000000000000000000000000..55aaf524e425c97cc451606945cd2a326fdc377f
--- /dev/null
+++ b/data-access/servlet/src/main/java/common/vo/soda/Pos.java
@@ -0,0 +1,93 @@
+
+import java.util.logging.Logger;
+
+
+class Pos
+{
+   protected static final Logger LOGGER = Logger.getLogger("Pos");
+
+   enum System {NONE, ICRS, GALACTIC};
+
+   String shape;
+   String value;
+
+   System system;
+
+   Circle  circle;
+   Range   range;
+   Polygon polygon;
+
+
+   public Pos(Circle circle)   {this.shape = "CIRCLE";  this.circle  = circle;}
+   public Pos(Range range)     {this.shape = "RANGE";   this.range   = range;}
+   public Pos(Polygon polygon) {this.shape = "POLYGON"; this.polygon = polygon;}
+
+
+   public Pos(String value)
+   {
+      LOGGER.info("trace: " + value);
+      parsePos(value);
+   }
+
+
+   private void parsePos(String str)
+   {
+      String[] strArr = str.strip().split(" +", 2);
+
+      if(strArr.length > 1)
+      {
+         this.shape = strArr[0].strip();
+         this.value = strArr[1].strip();
+      }
+      else
+      {
+         throw new IllegalArgumentException("POS value must have more then one space-separated elements but had "
+               + strArr.length + " elements)");
+      }
+
+      if(this.shape.equals("CIRCLE"))
+      {
+         this.circle = new Circle(this.value);
+      }
+      else if(this.shape.equals("RANGE"))
+      {
+         this.range = new Range(this.value);
+      }
+      else if(this.shape.equals("POLYGON"))
+      {
+         this.polygon = new Polygon(this.value);
+      }
+      else
+      {
+         throw new IllegalArgumentException("Valid POS shape is CIRCLE or RANGE or POLYGON but was: " + this.shape);
+      }
+
+   }
+
+   public void setSystem(Pos.System system) { this.system = system; }
+
+   public String toString()
+   {
+      String shapeStr;
+      if(this.shape.equals("CIRCLE"))
+      {
+         shapeStr = this.circle.toString();
+      }
+      else if(this.shape.equals("RANGE"))
+      {
+         shapeStr = this.range.toString();
+      }
+      else if(this.shape.equals("POLYGON"))
+      {
+         shapeStr = this.polygon.toString();
+      }
+      else
+      {
+         throw new IllegalArgumentException("Valid POS shape is CIRCLE or RANGE or POLYGON but was: " + this.shape);
+      }
+
+     return "POS: " + shapeStr;
+   }
+
+}
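+
+/* Usage sketch (illustrative values): the generic POS parameter carries the
+   shape name followed by the same numbers the shape-specific parameters use:
+
+      Pos pos = new Pos("CIRCLE 10.5 -0.3 0.25");
+      pos.setSystem(Pos.System.GALACTIC);
+      // pos.toString() -> "POS: CIRCLE 10.5 -0.3 0.25"
+*/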
+
diff --git a/data-access/servlet/src/main/java/common/vo/soda/Range.java b/data-access/servlet/src/main/java/common/vo/soda/Range.java
new file mode 100644
index 0000000000000000000000000000000000000000..8f74921f091e22f3e938a39fdb0dc4d650a1083c
--- /dev/null
+++ b/data-access/servlet/src/main/java/common/vo/soda/Range.java
@@ -0,0 +1,98 @@
+
+import java.util.Arrays;
+
+class Range
+{
+
+   double lon1, lon2;
+   double lat1, lat2;
+
+   public Range(String value)
+   {
+      parseRange(value);
+   }
+
+   public Range(double lonCenter, double latCenter, double lonWidth, double latWidth)
+   {
+      lon1 = lonCenter - lonWidth/2.0;
+      lon2 = lonCenter + lonWidth/2.0;
+      lat1 = latCenter - latWidth/2.0;
+      lat2 = latCenter + latWidth/2.0;
+   }
+
+   public Range(Circle circle)
+   {
+      lon1 = circle.lon - circle.radius;
+      lon2 = circle.lon + circle.radius;
+
+      lat1 = circle.lat - circle.radius;
+      lat2 = circle.lat + circle.radius;
+   }
+
+
+
+   public Range(Polygon polygon)
+   {
+      lon1 = Arrays.stream(polygon.lon).min().getAsDouble();
+      lon2 = Arrays.stream(polygon.lon).max().getAsDouble();
+
+      lat1 = Arrays.stream(polygon.lat).min().getAsDouble();
+      lat2 = Arrays.stream(polygon.lat).max().getAsDouble();
+   }
+
+
+
+   private void parseRange(String str)
+   {
+      String[] arr = str.strip().split(" +");
+
+      final int len = 4;
+      if(arr.length != len)
+         throw new IllegalArgumentException(
+               "RANGE : must have " + len + " elements delimited by space, but found " + arr.length);
+
+      double dbl = Double.parseDouble(arr[0]);
+      if ((dbl < 0) || (dbl > 360))
+         throw new IllegalArgumentException("RANGE : longitude must be in range [0,360] but found " + dbl);
+      this.lon1 = dbl;
+
+      dbl = Double.parseDouble(arr[1]);
+      if ((dbl < 0) || (dbl > 360))
+         throw new IllegalArgumentException("RANGE : longitude must be in range [0,360] but found " + dbl);
+      this.lon2 = dbl;
+
+      dbl = Double.parseDouble(arr[2]);
+      if ((dbl < -90) || (dbl > 90))
+         throw new IllegalArgumentException("RANGE : latitude must be in range [-90,90] but found " + dbl);
+      this.lat1 = dbl;
+
+      dbl = Double.parseDouble(arr[3]);
+      if ((dbl < -90) || (dbl > 90))
+         throw new IllegalArgumentException("RANGE : latitude must be in range [-90,90] but found " + dbl);
+      this.lat2 = dbl;
+   }
+
+
+   public String toString()
+   {
+      String str = "RANGE " + Double.valueOf(lon1) + " " + Double.valueOf(lon2)
+         + " " + Double.valueOf(lat1) + " " + Double.valueOf(lat2);
+      return str;
+   }
+
+}
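+
+/* Usage sketch (illustrative values): RANGE is "lon1 lon2 lat1 lat2" in
+   degrees; the Circle and Polygon constructors build the enclosing
+   coordinate box:
+
+      Range r   = new Range("10.0 11.0 -1.0 0.0");
+      Range box = new Range(new Circle("10.5 -0.3 0.25")); // 10.25 10.75 -0.55 -0.05
+*/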
+
diff --git a/data-access/servlet/src/main/java/common/vo/soda/SodaParam.java b/data-access/servlet/src/main/java/common/vo/soda/SodaParam.java
new file mode 100644
index 0000000000000000000000000000000000000000..6abd8445b448150c48fefcdf8cbdbdce3440ca65
--- /dev/null
+++ b/data-access/servlet/src/main/java/common/vo/soda/SodaParam.java
@@ -0,0 +1,11 @@
+
+
+/* FIXME contains also VLKB-legacy params which will be removed once clients can do SODA */
+
+public enum SodaParam
+{
+   ID, POS, CIRCLE, POLYGON, BAND, TIME, POL,
+   skysystem, specsystem,
+   pubdid, l, b, r, dl, db, vtype, vl, vu, nullvals
+}
+
diff --git a/data-access/servlet/src/main/java/common/vo/soda/SodaParser.java b/data-access/servlet/src/main/java/common/vo/soda/SodaParser.java
new file mode 100644
index 0000000000000000000000000000000000000000..76378ccfeaf39e68317b3f6081b486aca9111f83
--- /dev/null
+++ b/data-access/servlet/src/main/java/common/vo/soda/SodaParser.java
@@ -0,0 +1,232 @@
+
+import java.util.logging.Logger;
+
+import java.util.Map;
+import java.util.HashMap;
+import java.util.Arrays;
+
+
+
+
+class SodaParser
+{
+   protected static final Logger LOGGER = Logger.getLogger(SodaParser.class.getName());
+
+   //public class SodaParamMap extends HashMap<SodaParam, String[]> {};
+   Map<SodaParam, String[]> params;
+
+
+   Pos pos;
+   Band band;
+   Time time;
+   Pol pol;
+
+
+   public SodaParser(Map<SodaParam, String[]> params)
+   {
+      LOGGER.info("trace - there are " + params.size() + " params");
+      this.params = params;
+   }
+
+
+
+   /* Return null if the value is not present, the value if it is present exactly once,
+    * else throw MultiValuedParamNotSupported (a SODA error).
+    */
+   private  String soda_getSingleValue(SodaParam name)
+   {
+      LOGGER.info("trace");
+
+      String[] valArr = params.get(name);
+
+      if(valArr == null)
+         return null;
+      else
+         if(valArr.length == 0)
+            return null;
+         else if(valArr.length == 1)
+         {
+            LOGGER.info("ParamFound " + name.toString() + " : " + valArr[0]);
+            return valArr[0];
+         }
+         else
+            throw new MultiValuedParamNotSupported(name + " was found " + valArr.length + " times");
+   }
+
+/*
+   public String sodaReq_getResponseFormat(SodaParamMap params, String defaultResponseFormat)
+   {
+      String respFormat = soda_getSingleValue(params, "RESPONSEFORMAT");
+      return ((respFormat == null) ? defaultResponseFormat : respFormat);
+   }
+*/
+
+   public  boolean sodaReq_hasSodaId()
+   {
+      String id = soda_getSingleValue(SodaParam.ID);
+      return (id != null);
+   }
+
+
+   public  String sodaReq_getId()
+   {
+      String pubdid = soda_getSingleValue(SodaParam.ID);
+      if(pubdid == null)
+         throw new IllegalArgumentException("ID is missing, but is mandatory");
+      else
+         return pubdid;
+   }
+
+  public  Pos  sodaReq_getPosCirclePolygon()
+   {
+      String valuePos     = soda_getSingleValue(SodaParam.POS);
+      String valueCircle  = soda_getSingleValue(SodaParam.CIRCLE);
+      String valuePolygon = soda_getSingleValue(SodaParam.POLYGON);
+
+      Pos pos = null;
+
+      if( (valuePos != null) && (valueCircle == null) && (valuePolygon == null) )
+      {   
+         pos = new Pos(valuePos);
+         LOGGER.info(pos.toString());
+      }   
+      else if( (valuePos == null) && (valueCircle != null) && (valuePolygon == null) )
+      {   
+         Circle circle = new Circle(valueCircle);
+         LOGGER.info(circle.toString());
+         pos = new Pos(circle);
+      }   
+      else if( (valuePos == null) && (valueCircle == null) && (valuePolygon != null) )
+      {
+         Polygon polygon = new Polygon(valuePolygon);
+         LOGGER.info(polygon.toString());
+         pos = new Pos(polygon);
+      }
+      else
+      {
+         throw new IllegalArgumentException("Exactly one of POS | CIRCLE | POLYGON must be given.");
+      }
+
+      return pos;
+   }
+
+
+   public  Band sodaReq_getBand()
+   {
+      String value = soda_getSingleValue(SodaParam.BAND);
+      if(value == null)
+         return null;
+      else
+         return new Band(value);
+   }
+
+
+   public  Time sodaReq_getTime()
+   {
+      String value = soda_getSingleValue(SodaParam.TIME);
+      if(value == null)
+         return null;
+      else
+         return new Time(value);
+   }
+
+  public  Pol sodaReq_getPol()
+   {
+      String[] valArr = params.get(SodaParam.POL);
+
+
+      if(valArr == null)
+         return null;
+      else if(valArr.length < 1)
+         return null;
+      else
+      {
+         LOGGER.info("ParamFound " + SodaParam.POL.toString() + " : " + Arrays.toString(valArr));
+         return new Pol(valArr);
+      }
+   }
+
+
+  /* VLKB */
+
+
+  public  String vlkbReq_getPubdid()
+  {
+     String pubdid = soda_getSingleValue(SodaParam.pubdid);
+     if(pubdid == null)
+        throw new IllegalArgumentException(SodaParam.pubdid.toString() + " is missing, but is mandatory");
+     else
+        return pubdid;
+  }
+
+  public  Pos vlkbReq_getCircleRect()
+  {
+     Pos pos = null;
+
+     String l_value = soda_getSingleValue(SodaParam.l);
+     String b_value = soda_getSingleValue(SodaParam.b);
+
+     if( (l_value != null) && (b_value != null ) )
+     {
+        String r_value = soda_getSingleValue(SodaParam.r);
+        if(r_value != null)
+        {
+           Circle circle = new Circle(l_value + " " + b_value + " " + r_value);
+           pos = new Pos(circle);
+        }
+        else
+        {
+           String dl_value = soda_getSingleValue(SodaParam.dl);
+           String db_value = soda_getSingleValue(SodaParam.db);
+           if((dl_value != null) && (db_value != null))
+           {
+              double l  = Double.parseDouble(l_value);
+              double b  = Double.parseDouble(b_value);
+              double dl = Double.parseDouble(dl_value);
+              double db = Double.parseDouble(db_value);
+
+              Range range = new Range(l, b, dl, db);
+              pos = new Pos(range);
+           }
+           else
+           {
+              throw new IllegalArgumentException("one of 'r' or '(dl,db)' pair must be provided to designate sky area");
+           }
+        }
+     }
+     else
+     {
+        throw new IllegalArgumentException("VLKB sky position center (l,b) is missing however is mandatory");
+     }
+
+     return pos;
+  }
+
+
+  public  Band vlkbReq_getVelocity()
+  {
+     Band band = null;
+
+     String cvlow = soda_getSingleValue(SodaParam.vl);
+     String cvup  = soda_getSingleValue(SodaParam.vu);
+     //String cvtype = soda_getSingleValue(params, "vtype"); // "1"=VELO_LSRK or "2"=WAVE_Barycentric
+
+     boolean vel_valid = (cvlow != null) && (cvup != null);
+
+     if(vel_valid)
+     {
+        double vel_low = Double.parseDouble(cvlow);
+        double vel_up  = Double.parseDouble(cvup);
+        band = new Band(vel_low, vel_up);
+     }
+
+     return band;
+  }
+
+  public  boolean vlkbReq_getNullValues()
+  {
+     return (null != soda_getSingleValue(SodaParam.nullvals));
+  }
+
+}
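+
+/* Usage sketch (illustrative; the map would normally be filled from the
+   servlet request parameters, and the ID value here is hypothetical):
+
+      Map<SodaParam, String[]> params = new HashMap<>();
+      params.put(SodaParam.ID,     new String[]{"ivo://example.org/some-cube"});
+      params.put(SodaParam.CIRCLE, new String[]{"10.5 -0.3 0.25"});
+
+      SodaParser parser = new SodaParser(params);
+      String id = parser.sodaReq_getId();
+      Pos pos   = parser.sodaReq_getPosCirclePolygon();
+*/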
+
diff --git a/data-access/servlet/src/main/java/common/vo/soda/Time.java b/data-access/servlet/src/main/java/common/vo/soda/Time.java
new file mode 100644
index 0000000000000000000000000000000000000000..69afc8c7ef8707455a7556e2283d30321935c89a
--- /dev/null
+++ b/data-access/servlet/src/main/java/common/vo/soda/Time.java
@@ -0,0 +1,27 @@
+
+/* SODA 3.3.5 TIME
+...numeric values interpreted as Modified Julian Date(s) in UTC.
+As in DALI, open intervals use -Inf or +Inf as one limit.
+*/
+
+class Time
+{
+   enum System {MJD_UTC, NONE};
+
+   System system;
+   double[] mjdUtc;
+
+   public Time(String value)
+   {
+      mjdUtc = Parser.getDaliIntervalPositiveValues(value, "TIME");
+   }
+
+   public void setSystem(Time.System system) { this.system = system; }
+
+   public String toString()
+   {
+      return "TIME " + mjdUtc[0] + " " + mjdUtc[1];
+   }
+
+}
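+
+/* Usage sketch (illustrative values): TIME is an MJD(UTC) interval, e.g.
+   TIME=55197.0 55197.5 selects half a day:
+
+      Time t = new Time("55197.0 55197.5");
+      t.setSystem(Time.System.MJD_UTC);
+*/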
+
diff --git a/data-access/servlet/src/main/java/datasets/Cutout.java b/data-access/servlet/src/main/java/datasets/Cutout.java
new file mode 100644
index 0000000000000000000000000000000000000000..0ef94405e8cc5a6ccc8efc0a74ed8b653ee42035
--- /dev/null
+++ b/data-access/servlet/src/main/java/datasets/Cutout.java
@@ -0,0 +1,23 @@
+
+
+import java.io.OutputStream;
+import java.io.IOException;
+
+public interface Cutout
+{
+
+   public void doStream(String relPathname, int hdunum, Pos pos, Band band, Time time, Pol pol,
+        OutputStream outputStream) throws IOException, InterruptedException;
+
+
+   public CutResult doFile(String relPathname, int hdunum, Pos pos, Band band, Time time, Pol pol,
+         boolean countNullValues, FitsCard[] extraCards);
+
+}
+
diff --git a/data-access/servlet/src/main/java/datasets/CutoutImpl.java b/data-access/servlet/src/main/java/datasets/CutoutImpl.java
new file mode 100644
index 0000000000000000000000000000000000000000..a40de95062f8f2373e67ca2da08fc76cad60fe99
--- /dev/null
+++ b/data-access/servlet/src/main/java/datasets/CutoutImpl.java
@@ -0,0 +1,272 @@
+
+import java.util.logging.Logger;
+import java.util.logging.Level;
+import java.util.List;
+import java.util.ArrayList;
+import java.util.Arrays;
+
+import java.time.Instant;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.io.InputStreamReader;
+import java.io.OutputStreamWriter;
+import java.io.PrintWriter;
+import java.io.BufferedReader;
+import java.io.File;
+import java.io.FileInputStream;
+import java.io.FileNotFoundException;
+import java.nio.file.StandardOpenOption;
+import java.nio.file.Files;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+
+import java.time.*;// Timestamp in cut-filename
+import java.io.ByteArrayOutputStream; // for SODA direct streaming doSubimgStream
+
+class CutoutImpl implements Cutout
+{
+   static final Logger LOGGER = Logger.getLogger(CutoutImpl.class.getName());
+
+   private Settings    settings   = null;
+//   private Subsurvey[] subsurveys = null;
+
+
+   public CutoutImpl()
+   {
+      LOGGER.info("trace DatasetsImpl()");
+      this.settings = Settings.getInstance();
+   }
+
+
+   public CutoutImpl(Settings settings)
+   {
+      LOGGER.info("trace DatasetsImpl(settings)");
+      this.settings = settings;
+   }
+
+/*
+   public DatasetsImpl(Settings settings, Subsurvey[] subsurveys)
+   {
+      LOGGER.info("trace DatasetsImpl(settings, subsurveys)");
+      this.settings = settings;
+      this.subsurveys = subsurveys;
+   }
+*/
+
+   private String genRegionForVlkbOverlapCmd(Pos pos, Band band)
+   {
+      String region = "";
+
+      if(pos != null)
+      {
+         String skySystem = pos.system.name();
+
+         if(pos.shape.equals("CIRCLE"))
+         {
+            double l = pos.circle.lon;
+            double b = pos.circle.lat;
+            double r = pos.circle.radius;
+            region = region + "skysystem=" + skySystem + "&l=" + String.valueOf(l) + "&b=" + String.valueOf(b)
+               + "&r=" + String.valueOf(r);
+         }
+         else if(pos.shape.equals("RANGE"))
+         {
+            double l =  (pos.range.lon1 + pos.range.lon2)/2.0;
+            double b =  (pos.range.lat1 + pos.range.lat2)/2.0;
+            double dl = (pos.range.lon2 - pos.range.lon1);
+            double db = (pos.range.lat2 - pos.range.lat1);
+            region = region + "skysystem=" + skySystem + "&l=" + String.valueOf(l) + "&b=" + String.valueOf(b)
+               + "&dl=" + String.valueOf(dl) + "&db=" + String.valueOf(db);
+         }
+         else 
+         {
+            LOGGER.info("FIXME here Exception: POLYGON not supported or pos.shape invalid: " + pos.shape);
+         }
+
+      }
+
+      if(band != null)
+      {
+         String specSystem = band.system.name();
+         double vl = band.wavelength[0];
+         double vu = band.wavelength[1];
+
+         if(!region.isEmpty())
+            region = region + "&";
+         region = region + "specsystem=" + specSystem + "&vl=" + String.valueOf(vl) + "&vu=" + String.valueOf(vu);
+      }
+
+      return region;
+   }
+
+   public void doStream(String relPathname, int hdunum, Pos pos, Band band, Time time, Pol pol,
+         OutputStream outputStream)  throws IOException, InterruptedException
+      {
+         Instant start = Instant.now();
+
+         ByteArrayOutputStream bos = new ByteArrayOutputStream();
+
+         //String coordString = genRegionForVlkbOverlapCmd(pos, band);
+         JsonEncoder jReq = new JsonEncoder();
+         jReq.add(pos);
+         jReq.add(band);
+         jReq.add(time);
+         jReq.add(pol);
+         String coordString = jReq.toString();
+         LOGGER.info("coordString: " + coordString);
+
+         String absPathname = settings.fitsPaths.surveys() + "/" + relPathname;
+
+         /* calc bounds */
+
+         String[] cmdBounds = new String[4];
+         cmdBounds[0] = "/usr/local/bin/vlkb";
+         cmdBounds[1] = "overlap";
+         cmdBounds[2] = absPathname;
+         cmdBounds[3] = coordString;
+
+         ExecCmd execBounds = new ExecCmd();
+         execBounds.doRun(bos, cmdBounds);
+         LOGGER.info("execBounds exitValue: " + execBounds.exitValue);
+
+         bos.close();
+
+         boolean has_result = (execBounds.exitValue == 0);
+
+         Instant boundsDone = Instant.now();
+         LOGGER.info("EXECTIME boundsDone: " + Duration.between(start, boundsDone));
+
+         if(has_result)
+         {
+            String boundsString = new String(bos.toByteArray());
+            // remove end-of-line (was added by vlkb_ast.cpp: cout << ... << endl)
+            String lineSeparator = System.lineSeparator();
+            boundsString = boundsString.replace(lineSeparator, "");
+            LOGGER.info("BOUNDS: " + boundsString);
+
+            if(boundsString.trim().isEmpty())
+            {
+               throw new IllegalArgumentException(
+                     "region in file does not overlap with region defined by SODA parameters");
+            }
+            else
+            {
+               /* cutout -> outputStream */
+
+               String[] cmdCut = new String[6];
+               cmdCut[0] = "/usr/local/bin/vlkb";
+               cmdCut[1] = "imcopy";
+               cmdCut[2] = absPathname;
+               cmdCut[3] = String.valueOf(hdunum-1);
+               cmdCut[4] = boundsString;
+               cmdCut[5] = settings.fitsPaths.cutouts();
+
+               if(outputStream == null)
+                  LOGGER.info("supplied outputStream for cut-file is null");
+
+               ExecCmd execCut = new ExecCmd();
+               execCut.doRun(outputStream, cmdCut);
+
+               Instant cutDone = Instant.now();
+               LOGGER.info("EXECTIME    cutDone: " + Duration.between(start, cutDone));
+            }
+         }
+         else
+         {
+            throw new IllegalArgumentException(
+                  "overlap computation could not be completed with the given arguments");
+         }
+      }
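+
+   /* The two command lines built above, spelled out (paths illustrative):
+
+        vlkb overlap /surveys/rel/path.fits '{...pos/band/time/pol JSON...}'
+             prints the pixel bounds of the overlap (empty if none);
+        vlkb imcopy  /surveys/rel/path.fits <hdunum-1> <bounds> <cutouts-dir>
+             produces the cutout, captured from the process into outputStream.
+   */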
+
+
+   public CutResult doFile(String relPathname, int hdunum,
+         Pos pos, Band band, Time time, Pol pol,
+         boolean countNullValues, FitsCard[] extraCards)
+   {
+      LOGGER.info("trace: " + pos.toString() );
+
+      String absSubimgPathname = settings.fitsPaths.cutouts() + "/" + generateSubimgPathname(relPathname, hdunum);
+
+      JsonEncoder jReq = new JsonEncoder();
+      jReq.add(relPathname, hdunum);
+      jReq.add(pos);
+      jReq.add(band);
+      jReq.add(time);
+      jReq.add(pol);
+      jReq.add(countNullValues);
+      jReq.add(extraCards);
+
+      String outJson = doRpc( jReq.toString() );
+
+      return JsonDecoder.responseFromCutoutJson( outJson );
+   }
+
+
+
+   private String doRpc(String InStr)
+   {
+      final String userName = "guest";
+      final String password = "guest";
+      // FIXME move these to Settings
+
+      RpcOverAmqp rpc = new RpcOverAmqp(
+            userName, password,
+            settings.amqpConn.hostName(),
+            settings.amqpConn.portNumber(),
+            settings.amqpConn.routingKey());
+
+      rpc.initConnectionAndReplyQueue();
+
+      String OutStr = null;
+
+      try
+      {
+         LOGGER.info("Sent request : " + InStr);
+         OutStr = rpc.callAndWaitReply(InStr);
+         LOGGER.info("Got response : " + OutStr);
+      }
+      catch (Exception e)
+      {
+         LOGGER.log(Level.SEVERE, "doRpc failed:", e);
+      }
+      finally
+      {
+         try
+         {
+            rpc.close();
+         }
+         catch (Exception ignore)
+         {
+            LOGGER.info("ignoring exception on rpc.close():" + ignore.getMessage());
+         }
+      }
+
+      return OutStr;
+   }
+
+
+   private  String generateSubimgPathname(String relPathname, int hdunum)
+   {
+      String cutfitsname = "vlkb-cutout";
+
+      Instant instant = Instant.now() ;
+      String timestamp = instant.toString().replace(":","-").replace(".","_");
+
+      String tempPathname1 = relPathname.replaceAll("/","-");
+      String tempPathname2 = tempPathname1.replaceAll(" ","_");
+
+      if(hdunum == 1)
+      {
+         return cutfitsname + "_" + timestamp + "_" + tempPathname2;
+      }
+      else
+      {
+         String extnum = "EXT" + String.valueOf(hdunum-1);
+         return cutfitsname + "_" + timestamp + "_" + extnum + "_" + tempPathname2;
+      }
+   }
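+
+   /* Illustrative result: relPathname "survey a/cube.fits" with hdunum=2 at
+      2024-01-02T03:04:05.678Z yields
+      "vlkb-cutout_2024-01-02T03-04-05_678Z_EXT1_survey_a-cube.fits" */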
+
+}
+
diff --git a/data-access/servlet/src/main/java/datasets/CutoutImpl.java.backup.notes b/data-access/servlet/src/main/java/datasets/CutoutImpl.java.backup.notes
new file mode 100644
index 0000000000000000000000000000000000000000..7b97747cb3811f19673989ac4a2fce9bfcd74971
--- /dev/null
+++ b/data-access/servlet/src/main/java/datasets/CutoutImpl.java.backup.notes
@@ -0,0 +1,292 @@
+
+import java.util.logging.Logger;
+import java.util.logging.Level;
+import java.util.List;
+import java.util.ArrayList;
+import java.util.Arrays;
+
+import java.time.Instant;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.io.InputStreamReader;
+import java.io.OutputStreamWriter;
+import java.io.PrintWriter;
+import java.io.BufferedReader;
+import java.io.File;
+import java.io.FileInputStream;
+import java.io.FileNotFoundException;
+import java.nio.file.StandardOpenOption;
+import java.nio.file.Files;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+
+import java.time.*;// Timestamp in cut-filename
+import java.io.ByteArrayOutputStream; // for SODA direct streaming doSubimgStream
+
+class CutoutImpl implements Cutout
+{
+   static final Logger LOGGER = Logger.getLogger(DatasetsImpl.class.getName());
+
+   private Settings    settings   = null;
+//   private Subsurvey[] subsurveys = null;
+
+
+   public CutoutImpl()
+   {
+      LOGGER.info("trace DatasetsImpl()");
+      this.settings = Settings.getInstance();
+   }
+
+
+   public CutoutImpl(Settings settings)
+   {
+      LOGGER.info("trace DatasetsImpl(settings)");
+      this.settings = settings;
+   }
+
+/*
+   public DatasetsImpl(Settings settings, Subsurvey[] subsurveys)
+   {
+      LOGGER.info("trace DatasetsImpl(settings, subsurveys)");
+      this.settings = settings;
+      this.subsurveys = subsurveys;
+   }
+*/
+
+   private String genRegionForVlkbOverlapCmd(Pos pos, Band band)
+   {
+      String region = "";
+
+      if(pos != null)
+      {
+         String skySystem = pos.system.name();
+
+         if(pos.shape.equals("CIRCLE"))
+         {
+            double l = pos.circle.lon;
+            double b = pos.circle.lat;
+            double r = pos.circle.radius;
+            region = region + "skysystem=" + skySystem + "&l=" + String.valueOf(l) + "&b=" + String.valueOf(b)
+               + "&r=" + String.valueOf(r);
+         }
+         else if(pos.shape.equals("RANGE"))
+         {
+            double l =  (pos.range.lon1 + pos.range.lon2)/2.0;
+            double b =  (pos.range.lat1 + pos.range.lat2)/2.0;
+            double dl = (pos.range.lon2 - pos.range.lon1);
+            double db = (pos.range.lat2 - pos.range.lat1);
+            region = region + "skysystem=" + skySystem + "&l=" + String.valueOf(l) + "&b=" + String.valueOf(b)
+               + "&dl=" + String.valueOf(dl) + "&db=" + String.valueOf(db);
+         }
+         else 
+         {
+            LOGGER.info("FIXME here Exception: POLYGON not supported or pos.shape invalid: " + pos.shape);
+         }
+
+      }
+
+      if(band != null)
+      {
+         String specSystem = band.system.name();
+         double vl = band.wavelength[0];
+         double vu = band.wavelength[1];
+
+         region =region + "specsystem=" + specSystem + "&vl=" + String.valueOf(vl) + "&vu=" + String.valueOf(vu);
+      }
+
+      return region;
+   }
+
+   public void doStream(String relPathname, int hdunum, Pos pos, Band band, Time time, Pol pol,
+         OutputStream outputStream)  throws IOException, InterruptedException
+      {
+         Instant start = Instant.now();
+
+         ByteArrayOutputStream bos = new ByteArrayOutputStream();
+         if(bos == null)
+            throw new AssertionError("byte output stream for bounds was not created, is null");
+
+         //String coordString = genRegionForVlkbOverlapCmd(pos, band);
+         JsonEncoder jReq = new JsonEncoder();
+         jReq.add(pos);
+         jReq.add(band);
+         jReq.add(time);
+         jReq.add(pol);
+         String coordString = jReq.toString();
+         LOGGER.info("coordString: " + coordString);
+
+         String absPathname = settings.fitsPaths.surveys() + "/" + relPathname;
+
+         /* calc bounds */
+
+         String[] cmdBounds = new String[4];
+         cmdBounds[0] = "/usr/local/bin/vlkb";
+         cmdBounds[1] = "overlap";
+         cmdBounds[2] = absPathname;
+         cmdBounds[3] = coordString;
+
+         ExecCmd execBounds = new ExecCmd();
+         execBounds.doRun(bos, cmdBounds);
+         LOGGER.info("execBounds exitValue: " + execBounds.exitValue);
+
+         bos.close();
+
+         boolean has_result = (execBounds.exitValue == 0);
+
+         Instant boundsDone = Instant.now();
+         LOGGER.info("EXECTIME boundsDone: " + Duration.between(start, boundsDone));
+
+         if(has_result)
+         {
+            String boundsString = new String(bos.toByteArray());
+            // remove end-of-line (was added by vlkb_ast.cpp: cout << ... << endl)
+            String lineSeparator = System.lineSeparator();
+            boundsString = boundsString.replace(lineSeparator, "");
+            LOGGER.info("BOUNDS: " + boundsString);
+
+            if((boundsString != null) && boundsString.trim().isEmpty())
+            {
+               throw new IllegalArgumentException(
+                     "region in file does not overlap with region defined by SODA parameters");
+            }
+            else
+            {
+               /* cutout -> outputStream */
+
+               /* Draft notes (numbered steps 1-3 of a planned change):
+                  stream small cutouts directly, write large ones to a
+                  temporary file first.
+
+                  1, final long PIX_LIMIT = 50*1024*1024; // 50 Mpix: 50MB at 1 byte/pixel ... 400MB at 8 bytes/pixel
+                     boolean use_file = ( pixelCount(boundsString) > PIX_LIMIT );
+
+                  2, if(use_file)
+                        outFitsname = genTempFitsfilename(); // cfitsio uses a file (does not allocate mem)
+                     else
+                        outFitsname = "stdout://";           // cfitsio allocates mem for totpix to stream to stdout
+
+                  3, if(use_file) transfer the temporary file to outputStream afterwards. */
+
+               final String outFitsname = "(stdout://)";
+               final String extNum = "[" + String.valueOf(hdunum-1) + "]";
+
+               String[] cmdCut = new String[3];
+               cmdCut[0] = "/usr/local/bin/vlkb";
+               cmdCut[1] = "imcopy";
+               cmdCut[2] = absPathname + outFitsname  + extNum + boundsString;
+
+               if(outputStream == null)
+                  LOGGER.info("supplied outputStream for cut-file is null");
+
+               ExecCmd execCut = new ExecCmd();
+               execCut.doRun(outputStream, cmdCut);
+
+               Instant cutDone = Instant.now();
+               LOGGER.info("EXECTIME    cutDone: " + Duration.between(start, cutDone));
+            }
+         }
+         else
+         {
+            throw new IllegalArgumentException(
+                  "overlap computation could not be completed with the given arguments");
+         }
+      }
+
+
+   public CutResult doFile(String relPathname, int hdunum,
+         Pos pos, Band band, Time time, Pol pol,
+         boolean countNullValues, FitsCard[] extraCards)
+   {
+      LOGGER.info("trace: " + pos.toString() );
+
+      String absSubimgPathname = settings.fitsPaths.cutouts() + "/" + generateSubimgPathname(relPathname, hdunum);
+
+      JsonEncoder jReq = new JsonEncoder();
+      jReq.add(relPathname, hdunum);
+      jReq.add(pos);
+      jReq.add(band);
+      jReq.add(time);
+      jReq.add(pol);
+      jReq.add(countNullValues);
+      jReq.add(extraCards);
+
+      String outJson = doRpc( jReq.toString() );
+
+      return JsonDecoder.responseFromCutoutJson( outJson );
+   }
+
+
+
+   private String doRpc(String InStr)
+   {
+      final String userName = "guest";
+      final String password = "guest";
+      // FIXME move these to Settings
+
+      RpcOverAmqp rpc = new RpcOverAmqp(
+            userName, password,
+            settings.amqpConn.hostName(),
+            settings.amqpConn.portNumber(),
+            settings.amqpConn.routingKey());
+
+      rpc.initConnectionAndReplyQueue();
+
+      String OutStr = null;
+
+      try
+      {
+         LOGGER.info("Sent request : " + InStr);
+         OutStr = rpc.callAndWaitReply(InStr);
+         LOGGER.info("Got response : " + OutStr);
+      }
+      catch  (Exception e)
+      {
+         e.printStackTrace();
+      }
+      finally
+      {
+         try
+         {
+            rpc.close();
+         }
+         catch (Exception ignore)
+         {
+            LOGGER.info("ignoring exception on rpc.close():" + ignore.getMessage());
+         }
+      }
+
+      return OutStr;
+   }
+
+
+   private  String generateSubimgPathname(String relPathname, int hdunum)
+   {
+      String cutfitsname = "vlkb-cutout";
+
+      Instant instant = Instant.now() ;
+      String timestamp = instant.toString().replace(":","-").replace(".","_");
+
+      String tempPathname1 = relPathname.replaceAll("/","-");
+      String tempPathname2 = tempPathname1.replaceAll(" ","_");
+
+      if(hdunum == 1)
+      {
+         return cutfitsname + "_" + timestamp + "_" + tempPathname2;
+      }
+      else
+      {
+         String extnum = "EXT" + String.valueOf(hdunum-1);
+         return cutfitsname + "_" + timestamp + "_" + extnum + "_" + tempPathname2;
+      }
+   }
+
+}
+
diff --git a/data-access/servlet/src/main/java/datasets/Datasets.java b/data-access/servlet/src/main/java/datasets/Datasets.java
new file mode 100644
index 0000000000000000000000000000000000000000..c07bdb955e452ce4e20c042e33bafc31ea23df58
--- /dev/null
+++ b/data-access/servlet/src/main/java/datasets/Datasets.java
@@ -0,0 +1,23 @@
+
+
+import java.io.FileNotFoundException;
+import java.io.IOException;
+
+public interface Datasets
+{
+
+   public CutResult doMerge(String[] idArr, Coord coord, boolean countNullValues)
+      throws FileNotFoundException, IOException;
+
+
+   public DataLink doMCutout(String jdlJson)
+      throws IOException;
+
+}
+
diff --git a/data-access/servlet/src/main/java/datasets/DatasetsImpl.java b/data-access/servlet/src/main/java/datasets/DatasetsImpl.java
new file mode 100644
index 0000000000000000000000000000000000000000..7a5490643cedacce7455dbef1bfab9713bc2568f
--- /dev/null
+++ b/data-access/servlet/src/main/java/datasets/DatasetsImpl.java
@@ -0,0 +1,466 @@
+
+import java.util.logging.Logger;
+import java.util.logging.Level;
+import java.util.List;
+import java.util.ArrayList;
+import java.util.Arrays;
+
+import java.time.Instant;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.io.InputStreamReader;
+import java.io.OutputStreamWriter;
+import java.io.PrintWriter;
+import java.io.BufferedReader;
+import java.io.File;
+import java.io.FileInputStream;
+import java.io.FileNotFoundException;
+import java.nio.file.StandardOpenOption;
+import java.nio.file.Files;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+
+import java.time.*;// Timestamp in cut-filename
+import java.io.ByteArrayOutputStream; // for SODA direct streaming doSubimgStream
+
+class DatasetsImpl implements Datasets
+{
+   static final Logger LOGGER = Logger.getLogger(DatasetsImpl.class.getName());
+
+   private Settings    settings   = null;
+   private Subsurvey[] subsurveys = null;
+
+
+   public DatasetsImpl()
+   {
+      LOGGER.info("trace DatasetsImpl()");
+      this.settings = Settings.getInstance();
+   }
+
+
+   public DatasetsImpl(Settings settings)
+   {
+      LOGGER.info("trace DatasetsImpl(settings)");
+      this.settings = settings;
+   }
+
+
+   public DatasetsImpl(Settings settings, Subsurvey[] subsurveys)
+   {
+      LOGGER.info("trace DatasetsImpl(settings, subsurveys)");
+      this.settings = settings;
+      this.subsurveys = subsurveys;
+   }
+
+
+
+
+   public CutResult doMerge(String[] idArr, Coord coord, boolean countNullValues)
+         throws FileNotFoundException, IOException
+      {
+         LOGGER.info("trace");
+
+         return merge(idArr, coord, countNullValues);
+      }
+
+
+
+   public DataLink doMCutout(String jdlJson)
+         throws IOException
+      {
+         LOGGER.info("trace");
+
+         String updatedJsonString = JdlMCutout.resolveAndUpdateJsonRequest(jdlJson, settings, subsurveys);
+
+         String outJson = doRpc( JdlMCutout.mcutoutToJson(updatedJsonString) );
+
+         return JdlMCutout.responseFromMCutoutJson(outJson);
+      }
+
+
+
+   /* ================= ALL ================================== */
+
+
+   private String doRpc(String InStr)
+   {
+      final String userName = "guest";
+      final String password = "guest";
+      // FIXME move these to Settings
+
+      RpcOverAmqp rpc = new RpcOverAmqp(
+            userName, password,
+            settings.amqpConn.hostName(),
+            settings.amqpConn.portNumber(),
+            settings.amqpConn.routingKey());
+
+      rpc.initConnectionAndReplyQueue();
+
+      String OutStr = null;
+
+      try
+      {
+         LOGGER.info("Sent request : " + InStr);
+         OutStr = rpc.callAndWaitReply(InStr);
+         LOGGER.info("Got response : " + OutStr);
+      }
+      catch (Exception e)
+      {
+         LOGGER.log(Level.SEVERE, "doRpc failed:", e);
+      }
+      finally
+      {
+         try
+         {
+            rpc.close();
+         }
+         catch (Exception ignore)
+         {
+            LOGGER.info("ignoring exception on rpc.close():" + ignore.getMessage());
+         }
+      }
+
+      return OutStr;
+   }
+
+
+   private  String generateSubimgPathname(String relPathname, int hdunum)
+   {
+      String cutfitsname = "vlkb-cutout";
+
+      Instant instant = Instant.now() ;
+      String timestamp = instant.toString().replace(":","-").replace(".","_");
+
+      String tempPathname1 = relPathname.replaceAll("/","-");
+      String tempPathname2 = tempPathname1.replaceAll(" ","_");
+
+      if(hdunum == 1)
+      {
+         return cutfitsname + "_" + timestamp + "_" + tempPathname2;
+      }
+      else
+      {
+         String extnum = "EXT" + String.valueOf(hdunum-1);
+         return cutfitsname + "_" + timestamp + "_" + extnum + "_" + tempPathname2;
+      }
+   }
+
+
+   /* ================= MERGE =============================== */
+
+   private CutResult cutout(
+         String publisherDid, Coord coord,
+         boolean countNullValues)
+   {
+      ResolverByObsCore rsl = new ResolverByObsCore(settings.dbConn, subsurveys);
+      rsl.resolve(publisherDid);
+
+      FitsCard[] extraCards = Subsurvey.subsurveysFindCards(subsurveys, rsl.obsCollection());//rsl.subsurveyId);
+      String absSubimgPathname = settings.fitsPaths.cutouts() + "/" + generateSubimgPathname(rsl.relPathname(), rsl.hdunum());
+      LOGGER.info("absSubimgPathname: " + absSubimgPathname);
+
+      String inJson = JsonEncoderMerge.subimgToJson(
+            rsl.relPathname(), rsl.hdunum(),
+            coord,
+            absSubimgPathname,
+            extraCards,
+            countNullValues);
+
+      return JsonDecoder.responseFromCutoutJson( doRpc(inJson) );
+   }
+
+
+   protected CutResult merge(String[] pubdids, Coord coord, Boolean countNullValues)
+   {
+      LOGGER.info("trace");
+
+      ArrayList<CutResult> allresults = new ArrayList<CutResult>();
+
+      // 1. Decode pubdid's from inputs.pubdid and cutout on each
+
+      CutResult[] allCutResults = do_cutouts(
+            pubdids, coord,
+            countNullValues);
+
+      if(allCutResults == null)
+      {
+         LOGGER.warning("No cube found, no cutout created.");
+         return null;
+      }
+
+      allresults.addAll(Arrays.asList(allCutResults));
+
+
+      String[] allCutPathnames = selectCutPathnames(allCutResults);
+
+      if( allCutPathnames.length <= 0 ){
+         LOGGER.warning("No cutout created.");
+         return null;
+      }
+      if( allCutPathnames.length != pubdids.length ) {
+         LOGGER.warning("Number of cubes found and number of cutouts created do not match.");
+      }
+
+      try
+      {
+         // 2. regridding (closest neighbour interpolation) for files to be merged
+
+         Regrid grid = new Regrid();
+         int dim = grid.dimensions(allCutPathnames);
+         if( dim > 2 ) {
+            Boolean changed = grid.regrid_vel2(allCutPathnames);
+            if(changed){
+               //allresults.add("MSG Keywords CDELT3, CRVAL3 were adjusted for merge regridding.");
+               LOGGER.info("MSG Keywords CDELT3, CRVAL3 were adjusted for merge regridding.");
+            }
+         }
+
+         // 3. Merge cut-files
+
+         //String[] strar_results = mergefiles_parallel(id, logFileName,  // logfilename
+         //String[] strar_results = mergefiles_split_execution(id, logFileName,  // logfilename
+         CutResult strar_results = mergefiles(
+               String.valueOf(dim),  // prefix: "2D" or "3D"
+               allCutPathnames);     // files to merge
+
+         allresults.addAll(Arrays.asList(strar_results));
+
+      }
+      catch(Exception e)
+      {
+         LOGGER.log(Level.SEVERE, "merge:",e);
+         //allresults.add(
+         //      "MSG System error. Report time, your IP-number, and the exact request-URL string to service administrator.");
+      }
+
+      CutResult[] dlkArr = allresults.toArray(new CutResult[allresults.size()]);
+      return dlkArr[0]; // FIXME should return only datalink for the merged file not all cutout files?
+   }
+
+
+   protected CutResult[] do_cutouts(
+         String[] publisherDids, Coord coord,
+         Boolean countNullValues)
+   {
+      ArrayList<CutResult> allresults = new ArrayList<CutResult>();
+      if(publisherDids.length <= 0)
+         return null; // no cube found
+
+      for(String publisherDid : publisherDids)
+      {
+         CutResult cutout_results_table = cutout(
+               publisherDid, coord,
+               countNullValues);
+
+         allresults.addAll(Arrays.asList(cutout_results_table));
+      }
+
+      return allresults.toArray(new CutResult[allresults.size()]);
+   }
+
+
+   protected CutResult mergefiles(
+         String prefix,          // IN prefix added after filename-start-word
+         String[] filestomerge)  // IN abs path with filenames to be merged
+   {
+      LOGGER.info("trace");
+
+      String InJson = JsonEncoderMerge.mergefilesToJson( prefix, filestomerge);
+      String OutJson = doRpc(InJson);
+      return JsonDecoder.responseFromCutoutJson( OutJson );
+   }
+
+
+
+   // BEGIN parallel
+
+   protected String[] mergefiles_parallel(
+         String jobId,        // IN any identifier to be guaranteed distinct
+         String logfilename,  // IN logfilename without path
+         String prefix,          // IN prefix added after filename-start-word
+         String[] filestomerge)  // IN abs path with filenames to be merged
+   {
+      LOGGER.info("mergefiles_parallel()");
+
+      String[] responseCH = mergefiles_common_header(jobId, logfilename, prefix, filestomerge);
+      for(String sentence : responseCH) DatasetsImpl.LOGGER.info("responseCmnHdr: " + sentence);
+      // check if response errored -> abort with 500: Internal Server Error & log details
+
+      int threadsCount = filestomerge.length;
+      Thread threadArr[] = new Thread[threadsCount];
+      Reproject reprojectArr[] = new Reproject[threadsCount];
+      int i;
+      for(i=0; i<threadsCount; i++)
+         //for(String file : filestomerge)
+      {
+         String file = filestomerge[i];
+         reprojectArr[i] = new Reproject(this, jobId, prefix, file);
+         threadArr[i] = new Thread(reprojectArr[i], "reproject: " + String.valueOf(i)); 
+
+         threadArr[i].start();
+      }
+
+      // wait until all threads finished
+
+      for(i=0; i<threadsCount; i++)
+         //for(Thread thread : threadArr)
+      {
+         try
+         {
+            threadArr[i].join();
+         }
+         catch (InterruptedException e)
+         {
+            e.printStackTrace();
+         }
+
+
+         for(String sentence : reprojectArr[i].response) DatasetsImpl.LOGGER.info("response[" + String.valueOf(i) + "]: " + sentence);
+         if(!isResponseOk(reprojectArr[i].response))
+         {
+            ;// FIXME response incorrect -> abort merge-job, free resources
+             // if incorrect params -> respond HTTP.BAD_REQUEST
+             // if other error      -> respond HTTP.INTERNAL_ERROR & log
+         }
+      }
+
+      String[] response = mergefiles_add_reprojected(jobId, prefix);
+      // check if response errored -> abort with 500: Internal Server Error & log details
+
+      return response;
+   }
+
+   private boolean isResponseOk(String[] response)
+   {
+      // FIXME implement!
+      return true;
+   }
+
+
+
+
+
+   protected String[] mergefiles_split_execution(
+         String jobId,        // IN any identifier to be guaranteed distinct
+         String logfilename,  // IN logfilename without path
+         String prefix,          // IN prefix added after filename-start-word
+         String[] filestomerge)  // IN abs path with filenames to be merged
+   {
+      LOGGER.info("mergefiles_split_execution()");
+
+      String[] responseCH = mergefiles_common_header(jobId, logfilename, prefix, filestomerge);
+      // check if response errored -> abort with 500: Internal Server Error & log details
+
+      for(String file : filestomerge)// FIXME parallelize on threads & join
+      {
+         String[] response = mergefiles_reproject(jobId, prefix, file);
+         // check if response errored -> abort with: 500: Internal Server Error & log details
+      }
+
+      String[] response = mergefiles_add_reprojected(jobId, prefix);
+      // check if response errored -> abort with 500: Internal Server Error & log details
+
+      return response;
+   }
+
+   protected String[] mergefiles_common_header(
+         String jobId,     // IN jobId to distinguish parallel executed requests
+         String logfilename,     // IN logfilename without path
+         String prefix,          // IN prefix added after filename-start-word
+         String[] filestomerge)  // IN abs path with filenames to be merged
+   {
+      LOGGER.info("trace");
+
+      String InJson = JsonEncoderMerge.mergefilesCommonHeaderToJson(jobId, prefix, filestomerge);
+      String OutJson = doRpc(InJson);
+      String[] results = null; // FIXME JsonDecoder.responseFromJson(OutJson);
+
+      return results;
+   }
+
+
+   protected String[] mergefiles_reproject(
+         String jobId,     // IN jobId to distinguish parallel executed requests
+         String prefix,          // IN prefix added after filename-start-word
+         String fitsfilename)    // IN fits filename without path
+   {
+      LOGGER.info("trace");
+
+      String InJson = JsonEncoderMerge.mergefilesReprojectToJson(jobId, prefix, fitsfilename);
+      String OutJson = doRpc(InJson);
+      String[] results = null; // FIXME JsonDecoder.responseFromJson(OutJson);
+
+      return results;
+   }
+
+
+   protected String[] mergefiles_add_reprojected(
+         String jobId,     // IN jobId to distinguish parallel executed requests
+         String prefix)          // IN prefix added after filename-start-word
+   {
+      LOGGER.info("trace");
+
+      String InJson = JsonEncoderMerge.mergefilesAddReprojectedToJson(jobId, prefix);
+      String OutJson = doRpc(InJson);
+      String[] results = null; // FIXME JsonDecoder.responseFromJson(OutJson);
+
+      return results;
+   }
+
+   // END parallel
+
+
+
+
+
+   // returns selected data in list of strings:
+   // -- from cutout: the cutout filename (server local)
+   private String[] selectCutPathnames(CutResult[] results) {
+
+      LOGGER.info("selectCutPathnames()");
+
+      // return only data (without MSG's LOG's etc)
+      ArrayList<String> data = new ArrayList<String>();
+
+      // sanity check - move after doFunc call (here covered by exception)
+      // FIXME consider catch null-pointer-exception
+      if(results == null) {
+         LOGGER.info("selectCutPathnames: results-table is null.");
+         return null;
+      }
+
+      for (CutResult res : results) {
+
+         /*/ protect substring() calls below;
+         // FIXME consider catch exception index-out-of-bounds
+         if(res.length() < 3) {
+         LOGGER.warning("Assert(Results.toXML): results msg shorter then 3 chars : " + res);
+         continue;
+         }
+
+         // decode msg type
+         switch(res.substring(0,3)){
+         case "URL": // from cutout: the cutout filename for download
+         String localfname = res.substring(4);//.replaceAll(FITScutpath, "");
+         String[] ssfn = localfname.split(":");
+         //String[] ssfn = localfname.substring(4).split(":");
+         LOGGER.info("ssfn[0]: " + ssfn[0]);
+         LOGGER.info("ssfn[1]: " + ssfn[1]);
+         data.add(ssfn[1]);
+         //data.add(localfname);
+         break;
+         case "NVS": // from cutout : NVS_nn:nn:nn
+         case "MSG":
+         case "LOG":
+         case "CUT": // from cutout: the file which was cut
+         case "HID": // from search
+                     // no data, do nothing
+                     break;
+                     default:
+                     LOGGER.severe("Assert(Results.toXML): results msg has unhandled msgtype code : " + res);
+                     }*/
+         data.add(res.filename);
+      }
+
+      return data.toArray(new String[data.size()]);
+   }
+
+
+}
+
diff --git a/data-access/servlet/src/main/java/datasets/ExecCmd.java b/data-access/servlet/src/main/java/datasets/ExecCmd.java
new file mode 100644
index 0000000000000000000000000000000000000000..8255b272ee0f6a91477ee11b5e44094c7184d031
--- /dev/null
+++ b/data-access/servlet/src/main/java/datasets/ExecCmd.java
@@ -0,0 +1,99 @@
+
+import java.util.logging.Logger;
+import java.util.*;
+import java.io.*;
+
+
+class StreamGobbler extends Thread
+{
+   public static final Logger LOGGER = Logger.getLogger("StreamGobbler");
+
+   InputStream is;
+   String type;
+   OutputStream os;
+
+   StreamGobbler(InputStream is, String type)
+   {
+      this(is, type, null);
+   }
+
+   StreamGobbler(InputStream is, String type, OutputStream redirect)
+   {
+      this.is = is;
+      this.type = type;
+      this.os = redirect;
+   }
+
+
+   public void run()
+   {
+      try
+      {
+         BufferedOutputStream bos = null;
+         if(os != null)
+            bos = new BufferedOutputStream(os);
+
+         BufferedInputStream bis = new BufferedInputStream(is);
+
+         byte[] buffer = new byte[2048];
+         int nread = 0;
+         while ( (nread = bis.read(buffer)) != -1)
+         {
+            if (bos != null)
+            {
+               bos.write(buffer, 0, nread);
+            }
+            else
+            {
+               LOGGER.info(type + ">" + new String(buffer, 0, nread, "utf-8"));
+            }
+         }
+
+         if(bos !=null)
+            bos.flush();
+
+      } catch (IOException ioe)
+      {
+         ioe.printStackTrace();  
+      }
+   }
+}
+
+
+class ExecCmd
+{
+   public static final Logger LOGGER = Logger.getLogger("ExecCmd");
+
+   public int exitValue;
+
+   public void doRun(OutputStream outputStream, String[] cmd)
+      throws IOException, InterruptedException
+   {
+      // Assert outputStream != null
+
+      LOGGER.info("CMD: " + Arrays.toString(cmd));
+
+      Runtime rt = Runtime.getRuntime();
+      Process proc = rt.exec(cmd);
+
+      // any error message?
+      StreamGobbler errorGobbler = new StreamGobbler(proc.getErrorStream(), "ERROR");
+
+      // any output?
+      StreamGobbler outputGobbler = new StreamGobbler(proc.getInputStream(), "OUTPUT", outputStream);
+
+      // kick them off
+      outputGobbler.start();
+      errorGobbler.start();
+
+      outputGobbler.join();
+      errorGobbler.join();
+
+      exitValue = proc.waitFor();
+
+      outputStream.flush();
+   }
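+
+   // Usage sketch (command and stream are illustrative, not from the codebase):
+   //
+   //   ExecCmd exec = new ExecCmd();
+   //   exec.doRun(someOutputStream, new String[]{"/bin/echo", "hello"});
+   //   if(exec.exitValue != 0) { /* handle failure */ }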
+
+
+}
+
diff --git a/data-access/servlet/src/main/java/datasets/Regrid.java b/data-access/servlet/src/main/java/datasets/Regrid.java
new file mode 100644
index 0000000000000000000000000000000000000000..148c377b719a176dd4efd800d7aff6e48f06eeb8
--- /dev/null
+++ b/data-access/servlet/src/main/java/datasets/Regrid.java
@@ -0,0 +1,254 @@
+//
+// Regrid: pre-merge utilities over FITS headers:
+// - dimensions  : count non-degenerate axes common to all files (merge sanity check)
+// - regrid_vel2 : closest-neighbour velocity regrid by adjusting CDELT3/CRVAL3/CRPIX3
+//                 keywords (metadata only)
+
+import java.util.logging.Logger;
+import java.util.logging.Level;
+
+import java.io.IOException;
+import java.util.*; // ArrayList<String>
+import java.io.File;
+import nom.tam.fits.*;// Fits - for regridding
+
+
+
+class Regrid
+{
+   private static final Logger LOGGER = Logger.getLogger(Regrid.class.getName());
+
+   private Double average(ArrayList<Double> dd) {
+
+      Double sum = 0.0;
+      int sz = dd.size();
+
+      for (int i=0; i < sz; i++) {
+         sum += dd.get(i);
+      }
+      return sum / sz;
+   }
+
+
+   // returns number of non-degenerate axes in fitsfiles
+   // this is a utility for merge, where files must be homogeneous:
+   // the number of dimensions must be the same in
+   // all files, otherwise merge is not possible
+   //
+   // Counts only non-degenerate axes (NAXISi >1).
+   //
+   // in:  fitsfiles : list of filenames with full path !?
+   // out: int       : number of (non-degenerate) dimensions common to all fitsfiles
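+   //
+   // e.g. a file with NAXIS=4, NAXIS1=100, NAXIS2=100, NAXIS3=50, NAXIS4=1
+   //      counts as 3 (the degenerate NAXIS4=1 axis is ignored)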
+   public int dimensions(final String[] fitsfiles)
+      //throws FitsException, IOException
+   {
+      int dim = 0;
+
+      ArrayList<Integer>   Vnaxis = new ArrayList<Integer>();
+
+      for(String fitsfile : fitsfiles) {
+
+         // check if empty filename (;; typed on input)
+         if(fitsfile.length() == 0)
+            continue;
+
+         try {
+
+            Fits f = new Fits(fitsfile);
+            Integer naxis = f.getHDU(0).getHeader().getIntValue("NAXIS");
+
+            Integer nx = 0; // naxis excluding degenerate axes
+            for(int i=1; i<=naxis; i++){
+               if( 1 < f.getHDU(0).getHeader().getIntValue("NAXIS" + i))
+                  nx++;
+            }
+            Vnaxis.add(nx);
+
+         } catch (FitsException e) {
+            LOGGER.log(Level.SEVERE, "dimensions:",e);
+         } catch (IOException e) {
+            LOGGER.log(Level.SEVERE, "dimensions:",e);
+         }
+      }
+      // check that dimensions in all files match
+      if(Vnaxis.isEmpty()) return 0; // guard: no readable input files
+      dim = Vnaxis.get(0);
+      for( int ix = 1; ix < Vnaxis.size() ; ix++ ) {
+         if (Vnaxis.get(ix) != dim) {
+            // FIXME throw exception
+            dim = 0;
+         };
+      }
+      return dim;
+   }
+
+   // regrid variant 2:
+   // CRPIX can differ between the files to be merged -> shift CRPIX to a common CRVAL
+   // attempt to regrid velocity if needed and possible
+   // returns true:  add msg to client: "Keywords CDELT3, CRVAL3
+   //                were modified for merge, see header of the merged file."
+   // returns false: do nothing
+   //
+   // uses closest-neighbour algorithm - changes only metadata
+   // possible: if CDELT3 does not differ by more than one pixel (= size of CDELT3)
+   //           between the files to be merged and
+   //           NAXIS3 is exactly the same (in cutouts it should be)
+   // needed:   if CDELT3 & CRVAL3 are not (exactly) equal between the files to be merged
+   //
+   // WARN: assumes (as Montage does too) the 3rd axis is velocity (compatible)
+   public Boolean regrid_vel2(final String[] fitsfiles)
+      //   Boolean regrid_vel2(final ArrayList<String> fitsfiles)
+      //throws FitsException, IOException
+   {
+      ArrayList<Double> Vcrval = new ArrayList<Double>();
+      ArrayList<Double> Vcrpix = new ArrayList<Double>();
+      ArrayList<Double> Vcdelt = new ArrayList<Double>();
+      ArrayList<Long>   Vnaxis = new ArrayList<Long>();
+
+      //
+      // 1, read needed keywords from files to be merged
+      //
+      for(String fitsfile : fitsfiles) {
+
+         // check if empty filename (;; typed on input)
+         if(fitsfile.length() == 0)
+            continue;
+
+         // read orig keyword values
+
+         try {
+
+            // FITS
+            Fits f = new Fits(fitsfile);
+
+            // we should check here that 3rd axis is velocity-compatible (freq etc)
+            // for now, remains only assumption (also Montage only assumes and offers mTransform if it is not so)
+            // String oldkwd = f.getHDU(0).getHeader().getStringValue("CTYPE3");
+
+            // get card values as string (for exact match comparison to avoid string -> double conversion artifacts)
+            // String allcardstr   = f.getHDU(0).getHeader().findKey("CRVAL3");
+            // String cardvaluestr = f.getHDU(0).getHeader().findCard("CRVAL3").getValue();
+            // LOGGER.info("CRVAL3 as string: " + cardvaluestr);
+
+            Vcrval.add(f.getHDU(0).getHeader().getDoubleValue("CRVAL3"));
+            Vcrpix.add(f.getHDU(0).getHeader().getDoubleValue("CRPIX3"));
+            Vcdelt.add(f.getHDU(0).getHeader().getDoubleValue("CDELT3"));
+            Vnaxis.add(f.getHDU(0).getHeader().getLongValue("NAXIS3"));
+
+
+         } catch (FitsException e) {
+            LOGGER.log(Level.SEVERE, "regrid_vel2:",e);
+         } catch (IOException e) {
+            LOGGER.log(Level.SEVERE, "regrid_vel2:",e);
+         }
+
+      }
+      /*/ debug print
+        for( int ix = 0; ix < Vcrval.size() ; ix++ ) {
+
+        LOGGER.info(ix +
+        " " + Vcrval.get(ix) +
+        " " + Vcdelt.get(ix) +
+        " " + Vcrpix.get(ix) +
+        " " + Vnaxis.get(ix)
+        );
+
+        }
+        */
+
+      //
+      // 2, check if closest-neighbour interpolation is possible/needed:
+      // NAXIS3 must match
+      // max diff(CDELT3) << absvalue(CDELT3)
+      // max diff(CRVAL3) << absvalue(CDELT3)
+      //
+      long dnaxis = Collections.max(Vnaxis) - Collections.min(Vnaxis);
+      //LOGGER.info("dNAXIS : " + dnaxis);
+      if( dnaxis != 0 ) {
+         return false;
+      }
+      double minCDELT = Collections.min(Vcdelt);
+      double maxCDELT = Collections.max(Vcdelt);
+
+      double avgCDELT    = average(Vcdelt);
+      double absavgCDELT = java.lang.Math.abs(avgCDELT);
+
+      // FIXME use exceptions instead...
+      if(absavgCDELT == 0.0) {
+         LOGGER.warning("regrid: avg(CDELT3) == 0");
+         return false;
+      }
+
+      double dcdelt = java.lang.Math.abs(maxCDELT - minCDELT);
+      //LOGGER.info("dCDELT : " + dcdelt
+      //                   + " ratio: " +
+      //                   String.format("%.1f",100.0*dcdelt/absavgCDELT)
+      //                   + " %" );
+      if(dcdelt > absavgCDELT) {
+         return false;
+      }
+
+      double minCRVAL = Collections.min(Vcrval);
+      double maxCRVAL = Collections.max(Vcrval);
+      double dcrval = java.lang.Math.abs(maxCRVAL - minCRVAL);
+      //LOGGER.info("dCRVAL : " + dcrval + "|CDELT| : " + absavgCDELT
+      //                   + " ratio: " +
+      //                   String.format("%.1f",100.0*dcrval/absavgCDELT)
+      //                   + " %" );
+      //            if(dcrval > absavgCDELT) {
+      //                return false;
+      //            }
+      // if we've got here all conditions for interpolation satisfied
+
+      // exact match, interpolation not needed
+      // ?? FIXME better to check exact match by comparing card values as strings ??
+      // to avoid string -> double conversion machine architecture dependencies (any?)
+      if((dcrval == 0.0) && (dcdelt == 0.0)){
+         return false;
+      }
+
+      //
+      // 3, interpolation possible and needed: update fits file headers
+      //    with new values
+      //
+
+      // interpolate closest neighbour: simply set the grid to the average of all;
+      // they differ by less than a pixel (= CDELT3)
+      double newCDELT = avgCDELT;        
+      double newCRVAL = average(Vcrval);
+
+      for(String fitsfile : fitsfiles) {
+
+         // check if empty filename (;; typed on input)
+         if(fitsfile.length() == 0)
+            continue;
+
+         try {
+
+            Fits f = new Fits(fitsfile);
+
+            double origCDELT = f.getHDU(0).getHeader().getDoubleValue("CDELT3");
+            String commentCDELT = "VLKB OrigVal: " + origCDELT;
+            f.getHDU(0).getHeader().addValue("CDELT3",newCDELT, commentCDELT);
+
+            double origCRVAL = f.getHDU(0).getHeader().getDoubleValue("CRVAL3");
+            String commentCRVAL = "VLKB OrigVal: " + origCRVAL;
+            f.getHDU(0).getHeader().addValue("CRVAL3",newCRVAL,commentCRVAL);
+
+            double origCRPIX = f.getHDU(0).getHeader().getDoubleValue("CRPIX3");
+            String commentCRPIX = "VLKB OrigVal: " + origCRPIX;
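+            // linear WCS: world = CRVAL + (pixel - CRPIX) * CDELT, so moving CRVAL
+            // to newCRVAL while keeping the world grid (approximately) fixed
+            // requires shifting CRPIX by (newCRVAL - origCRVAL) / newCDELT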
+            double newCRPIX = origCRPIX - ((newCRVAL - origCRVAL) / newCDELT);
+            f.getHDU(0).getHeader().addValue("CRPIX3",newCRPIX,commentCRPIX);
+
+            f.getHDU(0).rewrite();
+
+         } catch(Exception e) {
+            // FIXME do error handling properly...
+            LOGGER.log(Level.SEVERE, "regrid_vel2:", e);
+         }
+
+      }
+
+      return true;
+   }
+
+}
+
diff --git a/data-access/servlet/src/main/java/datasets/Reproject.java b/data-access/servlet/src/main/java/datasets/Reproject.java
new file mode 100644
index 0000000000000000000000000000000000000000..d2505d99559a4676648609330699189fb75b846d
--- /dev/null
+++ b/data-access/servlet/src/main/java/datasets/Reproject.java
@@ -0,0 +1,32 @@
+
+
+
+
+class Reproject implements Runnable
+{
+   String id;
+   String prefix;
+   String fileName;
+   String[] response;
+   DatasetsImpl datasets;
+
+   public Reproject(DatasetsImpl datasets, String id, String prefix, String fileName)
+   {
+      this.datasets  = datasets;
+      this.id        = id;
+      this.prefix    = prefix;
+      this.fileName  = fileName;
+   }
+
+
+   @Override
+   public void run()
+   {
+      String name = Thread.currentThread().getName();
+      DatasetsImpl.LOGGER.info("Start of " + name);
+      response = datasets.mergefiles_reproject(id, prefix, fileName);
+      DatasetsImpl.LOGGER.info("End   of " + name);
+   }
+
+}
+
diff --git a/data-access/servlet/src/main/java/datasets/json-rpc/JdlMCutout.java b/data-access/servlet/src/main/java/datasets/json-rpc/JdlMCutout.java
new file mode 100644
index 0000000000000000000000000000000000000000..c076dd1e4b08c55d9a77835566cd8930038817b7
--- /dev/null
+++ b/data-access/servlet/src/main/java/datasets/json-rpc/JdlMCutout.java
@@ -0,0 +1,206 @@
+
+import java.util.Iterator;
+/* 'JSON-Simple' library */
+import org.json.simple.JSONArray;
+import org.json.simple.JSONObject;
+import org.json.simple.parser.JSONParser;
+import org.json.simple.parser.ParseException;
+
+
+public class JdlMCutout
+{
+
+
+
+   /* used in mcutout to resolve pubdids to pathname+hdunum */
+   public static String resolveAndUpdateJsonRequest(String reqJsonString, Settings settings, Subsurvey[] subsurveys)
+   {
+      String str = new String();
+
+      JSONParser parser = new JSONParser();
+      try
+      {
+         JSONArray a = (JSONArray) parser.parse(reqJsonString);
+
+         for (Object o : a)
+         {
+            JSONObject cutParams = (JSONObject) o;
+
+            String publisherDid = (String) cutParams.get("pubdid");
+
+            /* resolve pubdid -> pathname hdunum extraCards */
+
+            FitsCard[] extraCards = null;
+
+            ResolverByObsCore rsl = new ResolverByObsCore(settings.dbConn, subsurveys);
+            rsl.resolve(publisherDid);
+
+            extraCards = Subsurvey.subsurveysFindCards(subsurveys, rsl.obsCollection());//subsurveyId);
+
+            /* add resolved info to json */
+
+            cutParams.put("filename",rsl.relPathname());
+            cutParams.put("hdunum",rsl.hdunum());
+
+            if(extraCards != null)
+            {
+               if(extraCards.length > 0)
+               {
+                  JSONArray jcards = new JSONArray();
+                  for(FitsCard card : extraCards)
+                  {
+                     JSONObject jcard = new JSONObject();
+                     jcard.put("key", card.key);
+                     jcard.put("value", card.value);
+                     jcard.put("comment", card.comment);
+                     jcards.add(jcard);
+                  }
+                  cutParams.put("extra_cards", jcards);
+               }
+            }
+         }
+
+         str = a.toString();
+      }
+      catch(ParseException e)
+      {
+         e.printStackTrace();
+      }
+
+      return str;
+   }
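+
+   // Example (sketch) of the transformation above; the pubdid and filename are illustrative:
+   //
+   //   in:  [{"pubdid":"ivo://auth?survey/cube.fits", ...}, ...]
+   //   out: [{"pubdid":"...", "filename":"survey/cube.fits", "hdunum":1,
+   //          "extra_cards":[{"key":...,"value":...,"comment":...}], ...}, ...]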
+
+
+
+   @SuppressWarnings("unchecked")
+   public static String mcutoutToJson(String reqJsonString)
+   // FIXME this should do: throws ParsingError
+   {
+      JSONArray arrCuts;
+      JSONParser parserCuts = new JSONParser();
+      try
+      {
+         arrCuts = (JSONArray) parserCuts.parse(reqJsonString);
+      }
+      catch(ParseException pe)
+      {
+         System.out.println("mcutout json from HTTP-body incorrect");
+         return "";
+      }
+
+      JSONObject objParameters = new JSONObject();
+      objParameters.put("cuts", arrCuts);
+
+      JSONObject obj = new JSONObject();
+      obj.put("service",  "MCUTOUT");
+      obj.put("parameters",  objParameters);
+
+      return obj.toJSONString();
+   }
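+
+   // Example (sketch) of the envelope produced above (the cuts array comes from the HTTP-body):
+   //
+   //   {"service":"MCUTOUT","parameters":{"cuts":[ ... ]}}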
+
+
+
+
+   public static DataLink responseFromMCutoutJson(String response)
+      // throws ParseException
+   {   
+      DataLink dlk = new DataLink();
+
+      try {
+         JSONParser parser = new JSONParser();
+         Object jsonObj = parser.parse(response);
+         JSONObject jsonObject  = (JSONObject) jsonObj;
+
+         JSONObject jexcept = (JSONObject) jsonObject.get("exception");
+         if(jexcept != null)
+         {
+            String type = (String)jexcept.get("type");
+            String msg  = (String)jexcept.get("msg");
+            if(type.equals("INVALID_PARAM"))
+            {
+               throw new IllegalArgumentException(msg);
+            }
+            else if(type.equals("SYSTEM_ERROR"))
+            {
+               throw new IllegalStateException("Internal system error.");
+            }
+         }
+         else
+         {
+            long fileSize = (long) jsonObject.get("filesize");
+            String fileName = (String) jsonObject.get("filename");
+
+            JSONArray jsonArray = (JSONArray)jsonObject.get("responses");
+
+            MCutResult[] mcutResArr = new MCutResult[jsonArray.size()];
+
+            int i = 0;
+            @SuppressWarnings("unchecked")
+            Iterator<JSONObject> itr = jsonArray.iterator();
+            while (itr.hasNext())
+            {
+               JSONObject jObj = itr.next();
+
+               mcutResArr[i] = new MCutResult(); // allocate the element before filling its fields
+               mcutResArr[i].inputs  = (Inputs)jObj.get("input"); // FIXME raw cast from JSON value
+               String ctype = (String)jObj.get("type");
+               if(ctype.equals("FILENAME"))
+                  mcutResArr[i].contentType = MCutResult.ContentType.FILENAME;
+               else if(ctype.equals("BAD_REQUEST"))
+                  mcutResArr[i].contentType = MCutResult.ContentType.BAD_REQUEST;
+               else if(ctype.equals("SERVICE_ERROR"))
+                  mcutResArr[i].contentType = MCutResult.ContentType.SERVICE_ERROR;
+               mcutResArr[i].content = (String)jObj.get("content");
+               i++;
+            }
+
+            dlk.contentLength = fileSize;
+            dlk.accessUrl     = fileName;
+            dlk.mcutResultArr = mcutResArr;
+         }
+      }   
+      catch  (ParseException e)
+      {   
+         e.printStackTrace();
+         throw new IllegalStateException("Internal system error.");
+      }   
+      return dlk;
+   }
+
+
+
+
+
+   public static String[] pubdidsFromReqJson(String reqJsonString)
+      // throws ParseException
+   {
+      String[] pubdids = null;
+
+      try
+      {
+         JSONParser parser = new JSONParser();
+         JSONArray jsonArray = (JSONArray)parser.parse(reqJsonString);
+
+         pubdids = new String[jsonArray.size()];
+
+         int i = 0;
+         @SuppressWarnings("unchecked")
+         Iterator<JSONObject> itr = jsonArray.iterator();
+         while (itr.hasNext())
+         {
+            JSONObject jObj = itr.next();
+            pubdids[i] = (String)jObj.get("pubdid");
+            i++;
+         }
+      }
+      catch  (Exception e)
+      { // FIXME ParseException and others
+         e.printStackTrace();
+      }
+      return pubdids;
+   }
+
+
+
+
+}
+
diff --git a/data-access/servlet/src/main/java/datasets/json-rpc/JsonDecoder.java b/data-access/servlet/src/main/java/datasets/json-rpc/JsonDecoder.java
new file mode 100644
index 0000000000000000000000000000000000000000..a3b9ffc3abf19e876d5e2e8a7e0b8518568da393
--- /dev/null
+++ b/data-access/servlet/src/main/java/datasets/json-rpc/JsonDecoder.java
@@ -0,0 +1,88 @@
+
+import java.util.logging.Logger;
+import java.util.logging.Level;
+import java.util.Iterator;
+/* 'JSON-Simple' library */
+import org.json.simple.JSONArray;
+import org.json.simple.JSONObject;
+import org.json.simple.parser.JSONParser;
+import org.json.simple.parser.ParseException;
+
+
+
+// Engine --> Servlet
+//
+// 1. exception (decoded inside 2 3)
+//
+// 2. response from cutout:   struct cutout_res_s  --> CutResult
+//    .filesize
+//    .filename
+//    .nullvals_count : {fillratio nullcount totalcount}
+//
+// NOTE: MCutout moved to -> dacc/JdlMCutout.java
+// 3. response from mcutout:  struct mcutout_res_s --> DataLink
+//    .filesize
+//    .tgz_filename
+//    .cut_resp_s[] : {cut_param_s content_type content}
+//
+//    cut_param_s : {pubdid, coordinates, bool-countNullVals, filename, hdunum, cards[]}
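+//
+// Example (sketch) of a cutout response decoded below (values are illustrative):
+//
+//   {"filesize":123456, "filename":"cut_abc.fits",
+//    "nullvals_count":{"fillratio":0.95, "nullcount":10, "totalcount":200}}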
+
+
+
+public class JsonDecoder
+{
+   static final Logger LOGGER = Logger.getLogger(DatasetsImpl.class.getName());
+
+   public static CutResult responseFromCutoutJson(String response)
+      // throws ParseException
+   {
+      CutResult cut = new CutResult();
+
+      try
+      {
+         JSONParser parser = new JSONParser();
+         Object jsonObj = parser.parse(response);
+         JSONObject jsonObject  = (JSONObject) jsonObj;
+
+         JSONObject jexcept = (JSONObject) jsonObject.get("exception");
+         if(jexcept != null)
+         {
+            String type = (String)jexcept.get("type");
+            String msg  = (String)jexcept.get("msg");
+            if(type.equals("INVALID_PARAM"))
+            {
+               throw new IllegalArgumentException(msg);
+            }
+            else if(type.equals("SYSTEM_ERROR"))
+            {
+               throw new IllegalStateException("Internal system error.");
+            }
+         }
+         else
+         {
+            long fileSize   = (long) jsonObject.get("filesize");
+            String fileName = (String) jsonObject.get("filename");
+
+            JSONObject jnvc  = (JSONObject)jsonObject.get("nullvals_count");
+            double fillRatio = (double) jnvc.get("fillratio");
+            long null_count  = (long) jnvc.get("nullcount");
+            long total_count = (long) jnvc.get("totalcount");
+
+            cut.filesize = fileSize;
+            cut.filename = fileName;
+            cut.nullValueCount.percent    = fillRatio;
+            cut.nullValueCount.nullCount  = null_count;
+            cut.nullValueCount.totalCount = total_count;
+         }
+      }
+      catch  (ParseException e)
+      {
+         LOGGER.info(e.getMessage());
+         e.printStackTrace();
+         throw new IllegalStateException("Internal system error.");
+      }
+
+      return cut;
+   }
+}
+
diff --git a/data-access/servlet/src/main/java/datasets/json-rpc/JsonEncoder.java b/data-access/servlet/src/main/java/datasets/json-rpc/JsonEncoder.java
new file mode 100644
index 0000000000000000000000000000000000000000..c76e0204766ab0381b81da81b1f71cc05a8b795f
--- /dev/null
+++ b/data-access/servlet/src/main/java/datasets/json-rpc/JsonEncoder.java
@@ -0,0 +1,170 @@
+
+import java.util.logging.Logger;
+import java.util.logging.Level;
+import java.util.Iterator;
+/* 'JSON-Simple' library */
+import org.json.simple.JSONArray;
+import org.json.simple.JSONObject;
+import org.json.simple.parser.JSONParser;
+import org.json.simple.parser.ParseException;
+
+
+public class JsonEncoder
+{
+   static final Logger LOGGER = Logger.getLogger("JsonEncoder");
+
+   private JSONObject obj;
+
+   public JsonEncoder() { this.obj = new JSONObject(); this.obj.put("service","SUBIMG"); }
+
+   public String toString() { return this.obj.toString(); }
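+
+   // Usage sketch (assembles a SUBIMG request; the pathname is illustrative):
+   //
+   //   JsonEncoder enc = new JsonEncoder();
+   //   enc.add("/surveys/cube.fits", 1);   // img_pathname, img_hdunum
+   //   enc.add(pos); enc.add(band);        // null arguments are simply skipped
+   //   String reqJson = enc.toString();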
+
+
+   public void add(String pathname, int hdunum)
+   {
+      LOGGER.info("trace" + pathname);
+
+      this.obj.put("img_pathname", pathname);
+      this.obj.put("img_hdunum",   hdunum);
+   }
+
+
+   // NOTE: implementation of NULL: do not put into json
+   // (alternatively could put JSON.NULL as "pos" value or use pos.system.NONE)
+
+   public void add(Pos pos)
+   {
+      if(pos != null)
+      {
+         JSONObject j = new JSONObject();
+         j.put("system", pos.system.toString());
+
+         if(pos.circle  != null) j.put("circle",  objJCircle(pos.circle));
+         if(pos.range   != null) j.put("range",   objJRange(pos.range));
+         if(pos.polygon != null) j.put("polygon", objJPolygon(pos.polygon));
+
+         this.obj.put("pos", j);
+      }
+   }
+
+
+   public void add(Band band)
+   {
+      if(band != null)
+      {
+         JSONObject j = new JSONObject();
+         j.put("system", band.system.toString());
+
+         JSONArray arr = new JSONArray();
+         for(double dbl : band.wavelength) arr.add(dbl);
+         j.put("interval",arr);
+
+         this.obj.put("band", j);
+      }
+   }
+
+
+   public void add(Time time)
+   {
+      if(time != null)
+      {
+         JSONObject j = new JSONObject();
+         j.put("system", time.system.toString());
+
+         JSONArray arr = new JSONArray();
+         for(double dbl : time.mjdUtc) arr.add(dbl);
+         j.put("interval",arr);
+
+         this.obj.put("time", j);
+      }
+   }
+
+
+   public void add(Pol pol)
+   {
+      if(pol != null)
+      {
+         JSONArray arr = new JSONArray();
+         for(String val : pol.states) arr.add(val);
+         this.obj.put("pol",arr);
+      }
+   }
+
+
+   public void add(FitsCard[] extraCards)
+   {
+      if((extraCards != null) && (extraCards.length > 0)) 
+      {
+         obj.put("extra_cards", extraCardsToJson(extraCards));
+      }
+   }
+
+
+   public void add(boolean countNullValues)
+   {
+      obj.put("count_null_values", countNullValues);
+   }
+
+
+
+   private JSONArray extraCardsToJson(FitsCard[] extraCards)
+   {
+      JSONArray jcards = new JSONArray();
+      for(FitsCard card : extraCards)
+      {
+         JSONObject j = new JSONObject();
+         j.put("key",     card.key);
+         j.put("value",   card.value);
+         j.put("comment", card.comment);
+         jcards.add(j);
+      }
+      return jcards;
+   }
+
+
+   private JSONObject objJCircle(Circle circle)
+   {
+      JSONObject obj = new JSONObject();
+      obj.put("lon", circle.lon);
+      obj.put("lat", circle.lat);
+      obj.put("radius", circle.radius);
+      return obj;
+   }
+
+
+   private JSONObject objJRange(Range range)
+   {
+      JSONObject obj = new JSONObject();
+      obj.put("lon1", range.lon1);
+      obj.put("lon2", range.lon2);
+      obj.put("lat1", range.lat1);
+      obj.put("lat2", range.lat2);
+      return obj;
+   }
+
+   private JSONObject objJPolygon(Polygon poly)
+   {
+      JSONObject obj = new JSONObject();
+      obj.put("lon", genPolyLonJsonArr(poly));
+      obj.put("lat", genPolyLatJsonArr(poly));
+      return obj;
+   }
+
+
+   private JSONArray genPolyLonJsonArr(Polygon polygon)
+   {
+      JSONArray jarr = new JSONArray();
+      for(double dbl : polygon.lon) jarr.add(dbl);
+      return jarr;
+   }
+
+
+   private JSONArray genPolyLatJsonArr(Polygon polygon)
+   {
+      JSONArray jarr = new JSONArray();
+      for(double dbl : polygon.lat) jarr.add(dbl);
+      return jarr;
+   }
+
+}
+
diff --git a/data-access/servlet/src/main/java/datasets/json-rpc/JsonEncoderMerge.java b/data-access/servlet/src/main/java/datasets/json-rpc/JsonEncoderMerge.java
new file mode 100644
index 0000000000000000000000000000000000000000..96151188c1d832acd331d9c8ebe63e0340839114
--- /dev/null
+++ b/data-access/servlet/src/main/java/datasets/json-rpc/JsonEncoderMerge.java
@@ -0,0 +1,260 @@
+
+import java.util.Iterator;
+/* 'JSON-Simple' library */
+import org.json.simple.JSONArray;
+import org.json.simple.JSONObject;
+import org.json.simple.parser.JSONParser;
+import org.json.simple.parser.ParseException;
+
+
+public class JsonEncoderMerge
+{
+
+
+   @SuppressWarnings("unchecked")
+   private static JSONObject coordinatesToJsonObj(Coord coord)
+   {
+      JSONObject obj = new JSONObject();
+
+      obj.put("skysystem",  coord.skySystem);
+      obj.put("specsystem", coord.specSystem);
+
+      /* SODA */
+
+      if(coord.pos != null)
+      {
+         obj.put("shape", coord.pos.shape); // moved inside the guard: coord.pos may be null
+         obj.put("pos",   objJPos(coord.pos));
+      }
+
+      if(coord.band != null)
+      {
+         obj.put("band", arrJBand(coord.band));
+      }
+
+      if(coord.time != null)
+      {
+         obj.put("time", genTimeJsonArr(coord.time) );
+      }
+
+      if(coord.pol != null)
+      {
+         obj.put("pol", genPolJsonArr(coord.pol) );
+      }
+
+      return obj;
+   }
+
+
+   private static JSONObject objJCircle(Circle circle)
+   {
+      JSONObject obj = new JSONObject();
+      obj.put("lon", circle.lon);
+      obj.put("lat", circle.lat);
+      obj.put("radius", circle.radius);
+      return obj;
+   }
+
+   private static JSONObject objJRange(Range range)
+   {
+      JSONObject obj = new JSONObject();
+      obj.put("lon1", range.lon1);
+      obj.put("lon2", range.lon2);
+      obj.put("lat1", range.lat1);
+      obj.put("lat2", range.lat2);
+      return obj;
+   }
+
+   private static JSONObject objJPolygon(Polygon poly)
+   {
+      JSONObject obj = new JSONObject();
+      obj.put("lon", genPolyLonJsonArr(poly));
+      obj.put("lat", genPolyLatJsonArr(poly));
+      return obj;
+   }
+
+   private static JSONObject objJPos(Pos pos)
+   {
+      JSONObject obj = new JSONObject();
+      if(pos.circle  != null) obj.put("circle",  objJCircle(pos.circle));
+      if(pos.range   != null) obj.put("range",   objJRange(pos.range));
+      if(pos.polygon != null) obj.put("polygon", objJPolygon(pos.polygon));
+      return obj;
+   }
+
+   private static JSONArray arrJBand(Band band)
+   {
+      JSONArray arr = new JSONArray();
+      for(double dbl : band.wavelength) arr.add(dbl);
+      return arr;
+   }
+
+   private static JSONArray genTimeJsonArr(Time time)
+   {
+      JSONArray jarr = new JSONArray();
+      for(double dbl : time.mjdUtc) jarr.add(dbl);
+      return jarr;
+   }
+
+   private static JSONArray genPolyLonJsonArr(Polygon polygon)
+   {
+      JSONArray jarr = new JSONArray();
+      for(double dbl : polygon.lon) jarr.add(dbl);
+      return jarr;
+   }
+   private static JSONArray genPolyLatJsonArr(Polygon polygon)
+   {
+      JSONArray jarr = new JSONArray();
+      for(double dbl : polygon.lat) jarr.add(dbl);
+      return jarr;
+   }
+
+
+
+   private static JSONArray genPolJsonArr(Pol pol)
+   {
+      JSONArray jarr = new JSONArray();
+      for(String str : pol.states) jarr.add(str);
+      return jarr;
+   }
+
+
+   private static JSONArray extraCardsToJson(FitsCard[] extraCards)
+   {
+      JSONArray jcards = new JSONArray();
+      for(FitsCard card : extraCards)
+      {
+         //jcards.add(card); FIXME check what would this add; compiler did not complain
+
+         JSONObject jcard = new JSONObject();
+         jcard.put("key", card.key);
+         jcard.put("value", card.value);
+         jcard.put("comment", card.comment);
+
+         jcards.add(jcard);
+      }
+      return jcards;
+   }
+
+
+
+
+
+   @SuppressWarnings("unchecked")
+   public static String subimgToJson(
+         String imgPathname,
+         int imgHdunum,
+         Coord coord,
+         String subimgFilename,
+         FitsCard[] extraCards,
+         boolean countNullValues)
+   {
+      JSONObject obj = new JSONObject();
+
+      obj.put("service",  "SUBIMG");
+
+      obj.put("img_pathname",    imgPathname);
+      obj.put("img_hdunum",      imgHdunum);
+      obj.put("coordinates",     coordinatesToJsonObj(coord));
+      obj.put("subimg_filename", subimgFilename);
+
+      if((extraCards != null) && (extraCards.length > 0))
+      {
+         obj.put("extra_cards", extraCardsToJson(extraCards));
+      }
+
+      obj.put("count_null_values", countNullValues);
+
+      return obj.toJSONString();
+   }
+
+
+
+   @SuppressWarnings("unchecked")
+   public static String mergefilesToJson(
+         String dimensionality, 
+         String[] filestomerge )
+   {
+      JSONObject objParameters = new JSONObject();
+      objParameters.put("dimensionality", dimensionality);
+
+      JSONArray fnames = new JSONArray();
+      for(String fn : filestomerge){
+         fnames.add(fn);
+      }
+      objParameters.put("files_to_merge", fnames);
+
+      JSONObject obj = new JSONObject();
+      obj.put("service",  "MERGEF");
+      obj.put("parameters",  objParameters);
+
+      return obj.toJSONString();
+   }
+
+
+
+   // BEGIN merge-parallel
+
+   @SuppressWarnings("unchecked")
+   public static String mergefilesCommonHeaderToJson(
+         String jobId,
+         String dimensionality, 
+         String[] filestomerge )
+   {
+      JSONObject objParameters = new JSONObject();
+      objParameters.put("merge_id", jobId);
+      objParameters.put("dimensionality", dimensionality);
+
+      JSONArray fnames = new JSONArray();
+      for(String fn : filestomerge){
+         fnames.add(fn);
+      }
+      objParameters.put("files_to_merge", fnames);
+
+      JSONObject obj = new JSONObject();
+      obj.put("service",  "MERGE1"); // MERGE phase 1: create common header
+      obj.put("parameters",  objParameters);
+
+
+      return obj.toJSONString();
+   }
+
+   @SuppressWarnings("unchecked")
+   public static String mergefilesReprojectToJson(
+         String jobId,
+         String dimensionality, 
+         String fitsFileName)
+   {
+      JSONObject objParameters = new JSONObject();
+      objParameters.put("merge_id", jobId);
+      objParameters.put("dimensionality", dimensionality);
+      objParameters.put("fits_filename", fitsFileName);
+
+      JSONObject obj = new JSONObject();
+      obj.put("service",  "MERGE2"); // MERGE phase 2: reproject one fitsfile
+      obj.put("parameters",  objParameters);
+
+      return obj.toJSONString();
+   }
+
+   @SuppressWarnings("unchecked")
+   public static String mergefilesAddReprojectedToJson(
+         String jobId,
+         String dimensionality )
+   {
+      JSONObject objParameters = new JSONObject();
+      objParameters.put("merge_id", jobId);
+      objParameters.put("dimensionality", dimensionality);
+
+      JSONObject obj = new JSONObject();
+      obj.put("service",  "MERGE3"); // MERGE phase 3: add all reprojected files
+      obj.put("parameters",  objParameters);
+
+      return obj.toJSONString();
+   }
+
+   // END merge-parallel
+
+
+}
+
diff --git a/data-access/servlet/src/main/java/datasets/json-rpc/RpcOverAmqp.java b/data-access/servlet/src/main/java/datasets/json-rpc/RpcOverAmqp.java
new file mode 100644
index 0000000000000000000000000000000000000000..795408dd79d6213a5b4e7fc4abedde5bfe1fec67
--- /dev/null
+++ b/data-access/servlet/src/main/java/datasets/json-rpc/RpcOverAmqp.java
@@ -0,0 +1,131 @@
+
+// At each vlkb-request:
+// establish "connection" to RabbitMQ-broker (host:port) on autogenerated "channel" as user ???
+// then using this connection:channel do:
+// * create a reply queue with autogenerated name
+// * start a consumer on this queue
+// generate a message with properties: corrId & reply-queue
+// * publish the message to the pre-defined "amq.direct" exchange with routingKey from config
+// * start waiting on reply-queue for next delivery
+//
+// It is the admin's responsibility to configure the routingKey in the Java-client (see Settings)
+// to the same value as the queue name used when starting vlkbd, to ensure delivery
+// of vlkb-requests from the exchange to the correct queue
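+//
+// Usage sketch (connection values come from Settings; names are illustrative):
+//
+//   RpcOverAmqp rpc = new RpcOverAmqp(user, pass, host, port, routingKey);
+//   rpc.initConnectionAndReplyQueue();
+//   String outJson = rpc.callAndWaitReply(inJson); // blocks until a reply with matching corrId arrives
+//   rpc.close();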
+
+
+
+import com.rabbitmq.client.ConnectionFactory;
+import com.rabbitmq.client.Connection;
+import com.rabbitmq.client.Channel;
+import com.rabbitmq.client.QueueingConsumer;
+import com.rabbitmq.client.AMQP.BasicProperties;
+import java.util.UUID;
+
+
+public class RpcOverAmqp
+{
+	private final boolean NO_ACK = true;
+	// affects message consumption from the queue:
+	// broker removes the msg right after delivery without waiting for confirmation;
+	// improves performance at the expense of reliability
+
+	private String userName = "guest";
+	private String password = "guest";
+	private String hostName;
+	private int portNumber;
+	private String routingKey;
+
+	private Connection connection;
+	private Channel channel;
+	private String replyQueueName;
+	private QueueingConsumer consumer;
+
+	private int channelNumber;
+
+
+	RpcOverAmqp(String userName, String password, String hostName, int portNumber, String routingKey)
+	{
+		this.userName = userName;
+		this.password = password;
+		this.hostName = hostName;
+		this.portNumber = portNumber;
+		this.routingKey = routingKey;
+	}
+
+
+
+	public void initConnectionAndReplyQueue()
+	{
+		try
+		{
+			ConnectionFactory factory = new ConnectionFactory();
+			factory.setHost(hostName);
+			factory.setPort(portNumber);
+
+			factory.setUsername(userName);
+			factory.setPassword(password);
+
+			connection = factory.newConnection();
+			channel = connection.createChannel();
+
+			channelNumber = channel.getChannelNumber();
+
+			replyQueueName = channel.queueDeclare().getQueue();
+			consumer = new QueueingConsumer(channel);
+
+			// Start a non-nolocal, non-exclusive consumer, with a server-generated consumerTag.
+			channel.basicConsume(replyQueueName, NO_ACK, consumer);
+		}
+		catch(Exception e)
+		{
+			e.printStackTrace();
+		}
+	}
+
+
+
+	public String callAndWaitReply(String message) throws Exception {
+		String response = null;
+		String corrId = UUID.randomUUID().toString();
+
+		BasicProperties props = new BasicProperties
+			.Builder()
+			.correlationId(corrId)
+			.replyTo(replyQueueName)
+			.build();
+
+		// send rpc params and where to reply (reply-queue & corrId)
+
+		channel.basicPublish("", routingKey, props, message.getBytes("UTF-8"));
+		//channel.basicPublish("amq.direct", routingKey, props, message.getBytes("UTF-8"));
+
+		// wait for reply msg and return if corrId matches
+
+		while (true)
+		{
+
+			QueueingConsumer.Delivery delivery = consumer.nextDelivery();
+
+			System.out.println("CorrId recv[" + channelNumber + "]: "  + delivery.getProperties().getCorrelationId()
+					+ "\nCorrId sent: " + corrId
+					+ "\nreplyQueueName: " +  replyQueueName);
+
+			if (delivery.getProperties().getCorrelationId().equals(corrId))
+			{
+				response = new String(delivery.getBody(),"UTF-8");
+				break;
+			}
+		}
+
+		return response;
+	}
+
+
+
+	public void close() throws Exception
+	{
+		connection.close();
+	}
+
+}
+
diff --git a/data-access/servlet/src/main/java/resolver/Resolver.java b/data-access/servlet/src/main/java/resolver/Resolver.java
new file mode 100644
index 0000000000000000000000000000000000000000..f9b762d076eb68dcd6e28e498978d5c0684ab8c9
--- /dev/null
+++ b/data-access/servlet/src/main/java/resolver/Resolver.java
@@ -0,0 +1,11 @@
+
+
+interface Resolver
+{
+   public void resolve(String id);
+
+   public String relPathname();
+   public int    hdunum();
+}
+
+
diff --git a/data-access/servlet/src/main/java/resolver/ResolverByObsCore.java b/data-access/servlet/src/main/java/resolver/ResolverByObsCore.java
new file mode 100644
index 0000000000000000000000000000000000000000..b2a101d427681a390c082a244eec8d150839f7d7
--- /dev/null
+++ b/data-access/servlet/src/main/java/resolver/ResolverByObsCore.java
@@ -0,0 +1,124 @@
+
+import java.util.logging.Logger;
+
+
+
+class ResolverByObsCore implements Resolver
+{
+   private static final Logger LOGGER = Logger.getLogger(ResolverByObsCore.class.getName());
+
+   private Settings.DBConn dbConn;
+   private Subsurvey[] subsurveys;
+
+   private String relPathname;
+   private int hdunum;
+   private String subsurveyId;
+   private String accessUrl;
+
+
+   public ResolverByObsCore(Settings.DBConn dbConn, Subsurvey[] subsurveys)
+   {
+      this.dbConn = dbConn;
+      this.subsurveys = subsurveys;
+   }
+
+
+   public String relPathname() { return this.relPathname; }
+   public int    hdunum() { return this.hdunum; }
+   public String obsCollection() { return this.subsurveyId; }
+
+
+   public void resolve(String pubdid)
+   {
+      LOGGER.info("trace " + pubdid);
+      try
+      {
+         resolveByMapping(pubdid, dbConn);
+      }
+      catch(ClassNotFoundException e)
+      {
+         LOGGER.info("DB driver class was not loaded. No database connection.");
+      }
+      LOGGER.info("relPathname   : " + relPathname);
+      LOGGER.info("hdunum        : " + String.valueOf(hdunum));
+      LOGGER.info("obsCollection : " + this.subsurveyId);
+   }
+
+
+
+   private void resolveByMapping(String pubdid, Settings.DBConn dbConn) throws ClassNotFoundException
+   {
+      LOGGER.info("trace " + pubdid);
+
+      if(this.subsurveys == null)
+      {
+         throw new IllegalStateException("subsurveys metadata is missing however storage-path is needed for resolve-by-mapping");
+      }
+
+      this.accessUrl   = null;
+      this.subsurveyId = null; // VLKB stores subsurveyId in ObsCore::obsCollection
+      db_queryAccessUrlAndObsCollection(dbConn, pubdid);//, this.accessUrl, this.subsurveyId);
+
+      if((accessUrl == null) || (accessUrl.length() < 1))
+      {
+         throw new IllegalStateException("ObsCore::accessUrl needed but is null or empty");
+      }
+
+      String fileName = accessUrl.substring( accessUrl.lastIndexOf('/') + 1, accessUrl.length() );
+
+      /* NOTE accessURL & IVOID have after hash Extension-number:  hdunum = 1 + extnum */
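+      // e.g. an accessUrl ending in "cube.fits#1" (illustrative) yields
+      //      fileName "cube.fits" and hdunum = 2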
+
+      int ixTo = fileName.lastIndexOf("#");// returns -1 if hash not found
+      boolean hash_not_found = (ixTo < 0);
+
+      if(hash_not_found)
+      {
+         hdunum = 1;
+      }
+      else if( (ixTo + 1) == fileName.length() )
+      {
+         hdunum = 1; // a filename ending with a hash is treated as having no extension number
+      }
+      else
+      {
+         hdunum = 1 + Integer.parseInt(fileName.substring(ixTo + 1));
+         fileName = fileName.substring(0, ixTo);
+      }
+
+      Subsurvey subsurvey = Subsurvey.findSubsurvey(this.subsurveys, this.subsurveyId);
+      String storagePath = subsurvey.storage_path;
+
+      if((storagePath == null) || (storagePath.length() < 1))
+      {
+         relPathname = fileName;
+      }
+      else
+      {
+         relPathname = storagePath + "/" + fileName;
+      }
+   }
+
+
+   private void db_queryAccessUrlAndObsCollection(Settings.DBConn dbConn, String pubdid)//, String accessUrl, String obsCollection)
+      throws ClassNotFoundException
+   {
+      LOGGER.info("trace");
+
+      ResolverByObsCoreDb rdb;
+      synchronized(ResolverByObsCoreDb.class)
+      {
+         ResolverByObsCoreDb.dbconn = dbConn;
+         rdb = new ResolverByObsCoreDb();
+      }
+
+      String[] results = rdb.queryAccessUrlAndObsCollection(pubdid);
+      if((results == null) || (results.length != 2))
+      {
+         throw new IllegalStateException("query for accessUrl and obsCollection fails for: "+ pubdid +" in "+ dbConn.uri());
+      }
+
+      this.accessUrl   = results[0];
+      this.subsurveyId = results[1];
+   }
+
+}
diff --git a/data-access/servlet/src/main/java/resolver/ResolverByObsCoreDb.java b/data-access/servlet/src/main/java/resolver/ResolverByObsCoreDb.java
new file mode 100644
index 0000000000000000000000000000000000000000..5f80c98dda71e83bd084b7bfefcf4fbbc76cc7a5
--- /dev/null
+++ b/data-access/servlet/src/main/java/resolver/ResolverByObsCoreDb.java
@@ -0,0 +1,191 @@
+
+import java.util.logging.Logger;
+
+// mySQL access
+import java.sql.Connection;
+import java.sql.DriverManager;
+import java.sql.Driver;
+import java.sql.ResultSet;
+import java.sql.Statement;
+import java.sql.SQLException;
+import java.sql.Array;
+// import javax.sql.*; needed if using DataSource instead of DriverManager for DB-connections
+
+import java.net.MalformedURLException;
+import java.net.URL;
+import java.net.URLClassLoader;
+
+import java.util.Enumeration;
+import java.util.List;
+import java.util.LinkedList;
+import java.util.Set;
+import java.util.HashSet;
+import java.util.ArrayList;
+
+
+
+
+public class ResolverByObsCoreDb
+{
+   private static final Logger LOGGER = Logger.getLogger(ResolverByObsCoreDb.class.getName());
+
+   private static final Settings settings = Settings.getInstance();
+   public static Settings.DBConn dbconn = settings.dbConn;
+   private static final String DB_DRIVER = "org.postgresql.Driver";
+
+   private Connection conn;
+   private Statement  st;
+   private ResultSet  res;
+
+
+   ResolverByObsCoreDb() throws ClassNotFoundException
+   {
+      conn = null;
+      st   = null;
+      res  = null;
+      //LOGGER.info("Loading DB driver: " + DB_DRIVER);
+      //Class.forName(DB_DRIVER);
+   }
+
+
+   public String[] queryAccessUrlAndObsCollection(String pubdid)
+   {
+      String obsPubdid =  pubdid;
+
+      String TheQuery = "SELECT access_url,obs_collection FROM obscore "
+         + "WHERE obs_publisher_did = \'"+ obsPubdid +"\'";
+      LOGGER.info(TheQuery);
+
+      String[] results = new String[2];
+
+      LOGGER.info("Connecting to: " + dbconn.uri() + " with optional user/pwd: " + dbconn.userName() + " / " + dbconn.password() );
+
+      try(
+            Connection conn = DriverManager.getConnection(dbconn.uri(), dbconn.userName(), dbconn.password());
+            Statement  st   = conn.createStatement();
+            ResultSet  res  = st.executeQuery(TheQuery);)
+      {
+         //res = doQuery(TheQuery);
+
+         if(res == null)
+         {
+            LOGGER.info("Pubdid not in the db: " + pubdid);
+            return null;
+         };
+
+         int count = 0;
+         while (res.next())
+         {
+            count++;
+            results[0] = res.getString("access_url").strip();
+            results[1] = res.getString("obs_collection").strip();
+         }
+
+         assert(count == 1);// pubdid is unique
+
+      }
+      catch (SQLException se)
+      {
+         logSqlExInfo(se);
+         se.printStackTrace();
+      }
+/*      catch (ClassNotFoundException e)
+      {
+         LOGGER.info("DB driver "+ DB_DRIVER +" not found: " + e.getMessage());
+         e.printStackTrace();
+      }
+            finally
+              {
+              closeAll();
+              }
+              */
+      return results; 
+   }
+
+
+   /*
+      private void closeAll() {
+
+      try {
+      if(res  != null ) res.close();
+      if(st   != null ) st.close();
+      if(conn != null ) conn.close();
+      } catch (Exception e){
+   // FIXME print ?warning? into Glassfish SQLdriver close failed
+   e.printStackTrace();
+      }
+
+      }
+      */
+
+
+   private void logSqlExInfo(SQLException se)
+   {
+      LOGGER.info("SQLState : " + se.getSQLState());
+      LOGGER.info("ErrorCode: " + se.getErrorCode());
+      LOGGER.info("Message  : " + se.getMessage());
+      Throwable t = se.getCause();
+      while(t != null) {
+         LOGGER.info("Cause: " + t);
+         t = t.getCause();
+      }
+   }
+
+
+   /*
+      private ResultSet doQuery(String TheQuery) throws SQLException, ClassNotFoundException
+      {
+   /* https://docs.oracle.com/javase/tutorial/jdbc/basics/connecting.html :
+    * Any JDBC 4.0 drivers that are found in your class path are automatically loaded.
+    * (However, you must manually load any drivers prior to JDBC 4.0 with the method
+    * Class.forName.)
+    * /
+//    Class.forName(DB_DRIVER);
+   /* OR
+    * DriverManager.registerDriver(new org.postgresql.Driver());
+    * LOGGER.info(getClasspathString());
+    * LOGGER.info(getRegisteredDriverList());
+    * /
+
+    LOGGER.info("Connecting to: " + dbconn.uri() + " with optional user/pwd: " + dbconn.userName() + " / " + dbconn.password() );
+
+//    Connection conn = DriverManager.getConnection(dbconn.uri(), dbconn.userName(), dbconn.password());
+
+    Statement  st   = conn.createStatement();
+
+    ResultSet  res  = st.executeQuery(TheQuery);
+
+    return res;
+      }
+   // Returns the list of JDBC Drivers loaded by the caller's class loader
+   private String getRegisteredDriverList()
+   {
+   StringBuffer drvList = new StringBuffer("getRegisteredDriverList:\r\n");
+   for (Enumeration e = DriverManager.getDrivers();
+   e.hasMoreElements(); )
+   {
+   Driver d = (Driver) e.nextElement();
+   String driverClass = d.getClass().getName();
+   drvList.append(driverClass).append("\r\n");	
+   }
+   return drvList.toString();
+   }
+
+   /*
+   public String getClasspathString() {
+   StringBuffer classpath = new StringBuffer("getClasspathString:\r\n");
+   ClassLoader applicationClassLoader = this.getClass().getClassLoader();
+   if (applicationClassLoader == null) {
+   applicationClassLoader = ClassLoader.getSystemClassLoader();
+   }
+   URL[] urls = ((URLClassLoader)applicationClassLoader).getURLs();
+   for(int i=0; i < urls.length; i++) {
+   classpath.append(urls[i].getFile()).append("\r\n");
+   }
+
+   return classpath.toString();
+   }
+   */
+
+
+}
diff --git a/data-access/servlet/src/main/java/resolver/ResolverFromId.java b/data-access/servlet/src/main/java/resolver/ResolverFromId.java
new file mode 100644
index 0000000000000000000000000000000000000000..c78fbcd6dd5eaae6d8ad56bcdb2a0696f6dd750e
--- /dev/null
+++ b/data-access/servlet/src/main/java/resolver/ResolverFromId.java
@@ -0,0 +1,67 @@
+
+import java.util.logging.Logger;
+
+
+
+class ResolverFromId implements Resolver
+{
+   private static final Logger LOGGER = Logger.getLogger(ResolverFromId.class.getName());
+
+   private String   relPathname;
+   private int      hdunum;
+
+
+   public String relPathname() {return this.relPathname;}
+   public int    hdunum() {return this.hdunum;}
+
+
+   public void resolve(String pubdid)
+   {
+      LOGGER.info("trace " + pubdid);
+
+      boolean isIvoid = pubdid.startsWith("ivo://"); // startsWith avoids out-of-bounds on short IDs
+
+      if(isIvoid)
+      {
+         resolveIvoid(pubdid);
+
+         LOGGER.info("relPathname : " + relPathname);
+         LOGGER.info("hdunum      : " + String.valueOf(hdunum));
+      }
+      else
+      {
+         throw new IllegalArgumentException("IVOID expected: ID must start with 'ivo://' but received: " + pubdid);
+      }
+   }
+
+
+
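+   // Example (illustrative ID): "ivo://auth/soda?survey/cube.fits#1"
+   //   -> relPathname = "survey/cube.fits", hdunum = 2 (1 + extension number)
+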
+   private void resolveIvoid(String pubdid)
+   {
+      LOGGER.info("trace " + pubdid);
+
+      int qmarkIx = pubdid.lastIndexOf("?");
+      int dhashIx = pubdid.lastIndexOf("#");// returns -1 if hash not found
+
+      boolean hash_not_found = (dhashIx < 0);
+
+      if(hash_not_found)
+      {
+         relPathname = pubdid.substring(  qmarkIx + 1 );
+         hdunum = 1;
+      }
+      else
+      {
+         relPathname = pubdid.substring(  qmarkIx + 1, dhashIx );
+
+         if((dhashIx+1) == pubdid.length())
+            throw new IllegalArgumentException(
+                  "if ID's last hash must be followed by HDU extension number however: " + pubdid);
+         else
+            hdunum = 1 + Integer.parseInt( pubdid.substring( dhashIx + 1 ) );
+      }
+   }
+
+
+}
+
diff --git a/data-access/servlet/src/main/java/vosi/VlkbServletFile.java b/data-access/servlet/src/main/java/vosi/VlkbServletFile.java
new file mode 100644
index 0000000000000000000000000000000000000000..74f531f56a32a8d3103d034f7f180d1b6c269aba
--- /dev/null
+++ b/data-access/servlet/src/main/java/vosi/VlkbServletFile.java
@@ -0,0 +1,150 @@
+//
+// return content of xml
+// (used for VOSI capabilities.xml and availability.xml)
+//
+
+import java.io.IOException;
+import java.io.PrintWriter;
+import java.io.File;
+import java.io.OutputStream;
+import java.util.Enumeration;
+import java.util.*; // ArrayList<String>
+
+import javax.servlet.ServletConfig;
+import javax.servlet.ServletException;
+import javax.servlet.http.HttpServletRequest;
+import javax.servlet.http.HttpServletResponse;
+import javax.servlet.ServletOutputStream; // for SODA
+
+// from vlkb_mergefiles.java - dir & file handling
+import java.io.*;
+
+import java.nio.file.*;
+import static java.nio.file.StandardCopyOption.*;
+
+
+// serve VOSI resources from xml files (for now implemented as strings, not files; FIXME)
+
+public class VlkbServletFile
+    extends javax.servlet.http.HttpServlet
+{
+    // for logs and debug
+    String className = this.getClass().getSimpleName();
+
+    // VOSI
+    // String accessURL = null; // FIXME now read from MERGEURL; later introduce own param
+    // String funcName = "vlkb_cutout"; // FIXME read from config file
+
+    private static final String availStr =
+          "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
+        + "<vosi:availability  "
+        + " xmlns:vosi=\"http://www.ivoa.net/xml/VOSIAvailability/v1.0\">"
+        + " <vosi:available>true</vosi:available>"
+        + " <vosi:note>service is accepting queries</vosi:note>"
+        + "</vosi:availability>";
+
+    private String capsStr = null;
+
+
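+    // Builds the VOSI capabilities document for this service: one capability
+    // element each for VOSI#capabilities, VOSI#availability, SODA#sync-1.0
+    // and SODA#async-1.0, all rooted at the given base URL (trailing slash stripped).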
+    protected void SetCapsStr(String URL, String funcName)
+    {
+        if(URL != null)
+        {
+            String accessURL = stripTrailingSlash(URL);
+
+            capsStr =
+                  "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
+
+                + "<vosi:capabilities "
+                +    "xmlns:vosi=\"http://www.ivoa.net/xml/VOSICapabilities/v1.0\" "
+                +    "xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" "
+                +    "xmlns:vod=\"http://www.ivoa.net/xml/VODataService/v1.1\">"
+
+                + " <capability standardID=\"ivo://ivoa.net/std/VOSI#capabilities\">"
+                + "   <interface xsi:type=\"vod:ParamHTTP\" version=\"1.0\">"
+                + "     <accessURL use=\"full\">"
+                +          accessURL + "/capabilities"
+                + "     </accessURL>"
+                + "   </interface>"
+                + " </capability>"
+
+                + " <capability standardID=\"ivo://ivoa.net/std/VOSI#availability\">"
+                + "   <interface xsi:type=\"vod:ParamHTTP\" version=\"1.0\">"
+                + "     <accessURL use=\"full\">"
+                +          accessURL + "/availability"
+                + "     </accessURL>"
+                + "   </interface>"
+                + " </capability>"
+
+                + " <capability standardID=\"ivo://ivoa.net/std/SODA#sync-1.0\">"
+                + "   <interface xsi:type=\"vod:ParamHTTP\" role=\"std\" version=\"1.0\">"
+                + "     <accessURL use=\"full\">"
+                +          accessURL + "/" + funcName
+                + "     </accessURL>"
+                + "   </interface>"
+                + " </capability>"
+
+                + " <capability standardID=\"ivo://ivoa.net/std/SODA#async-1.0\">"
+                + "   <interface xsi:type=\"vod:ParamHTTP\" role=\"std\" version=\"1.0\">"
+                + "     <accessURL use=\"full\">"
+                +          accessURL + "/" + funcName + "_uws/soda_cuts"
+                + "     </accessURL>"
+                + "   </interface>"
+                + " </capability>"
+
+                + "</vosi:capabilities>";
+        }
+    }
+
+
+    String stripTrailingSlash(String path)
+    {
+        if (path.endsWith("/"))
+            return path.substring(0, path.length() - 1);
+        else
+            return path;
+    }
+
+
+    protected void doGet(HttpServletRequest request,
+                         HttpServletResponse response)
+        throws ServletException, IOException {
+
+            doPost(request, response);
+        }
+
+
+
+    protected void doPost(HttpServletRequest request,
+                          HttpServletResponse response)
+        throws ServletException, IOException
+    {
+        StringBuffer requestURL = request.getRequestURL();
+
+        System.out.println(className + " vlkb req from: " + request.getRemoteAddr()
+                           + " requestURL: " + requestURL.toString());
+
+        PrintWriter writer = response.getWriter();
+        response.setContentType("text/xml");
+
+        if(-1 != requestURL.lastIndexOf("/capabilities"))
+        {
+            String fullURL = request.getRequestURL().toString();
+            String baseURL = fullURL.substring(0, requestURL.lastIndexOf("/"));
+
+            SetCapsStr(baseURL, "soda");
+            writer.println(capsStr);
+        }
+        else if(-1 != requestURL.lastIndexOf("/availability"))
+        {
+            writer.println(availStr);
+        }
+        // FIXME error handling if neither path matches (e.g. misconfigured web.xml)
+
+        writer.close();
+        return;
+    }
+}
+
diff --git a/data-access/servlet/src/main/java/webapi/AuthZFilter.java b/data-access/servlet/src/main/java/webapi/AuthZFilter.java
new file mode 100644
index 0000000000000000000000000000000000000000..c2b94e1f7430f3fb4c9e18411474a9e870a74785
--- /dev/null
+++ b/data-access/servlet/src/main/java/webapi/AuthZFilter.java
@@ -0,0 +1,147 @@
+
+//import it.inaf.ia2.aa.data.User;
+
+import java.io.IOException;
+import java.io.BufferedReader;
+import java.io.InputStreamReader;
+import java.util.*; // ArrayList<String> Collection<>
+
+import java.util.logging.Logger;
+import javax.servlet.Filter;
+import javax.servlet.FilterChain;
+import javax.servlet.FilterConfig;
+import javax.servlet.ServletException;
+import javax.servlet.ServletRequest;
+import javax.servlet.ServletResponse;
+import javax.servlet.http.HttpServletRequest;
+import javax.servlet.http.HttpServletResponse;
+import javax.servlet.http.Part;
+
+
+import javax.servlet.http.HttpServletRequestWrapper;
+import java.security.Principal;
+
+
+/* response wrapper */
+
+import javax.servlet.http.HttpServletResponseWrapper;
+import javax.servlet.ServletOutputStream;
+
+import java.io.PrintWriter;
+import java.io.StringWriter;
+import java.io.OutputStream;
+import java.io.DataOutputStream;
+import java.io.ByteArrayOutputStream;
+
+
+
+class AuthZ
+{
+   private static final Logger LOGGER = Logger.getLogger("AuthZ");
+   private static final AuthZSettings settings = AuthZSettings.getInstance("authpolicy.properties");
+
+   List<String> pubdidList = new ArrayList<String>();
+
+   String servletPath;
+
+
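+   // Collects the dataset identifiers to be authorized: SODA-style repeated
+   // "ID" parameters take precedence; otherwise the legacy semicolon-separated
+   // "pubdid" parameter is split into individual identifiers.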
+   public AuthZ(HttpServletRequest req) throws IOException, ServletException
+   {
+      LOGGER.info("constructor");
+
+      String[] pubdidArr = req.getParameterValues("ID");
+      if(pubdidArr == null)
+      {
+         String pubdids = req.getParameter("pubdid");
+         if(pubdids != null) pubdidArr = pubdids.split(";");
+      }
+
+      if(pubdidArr != null)
+      {
+         for(String pubdid : pubdidArr)
+            if(pubdid.length() > 0) pubdidList.add(pubdid);
+
+         LOGGER.info("pubdids: " + String.join(" ", pubdidList));
+      }
+   }
+
+
+   private String getValue(Part part) throws IOException
+   {
+      BufferedReader reader = new BufferedReader(new InputStreamReader(part.getInputStream(), "UTF-8"));
+      StringBuilder value = new StringBuilder();
+      char[] buffer = new char[1024];
+      for (int length = 0; (length = reader.read(buffer)) > 0;)
+      {
+         value.append(buffer, 0, length);
+      }
+      return value.toString();
+   }
+
+
+
+   public boolean isAuthorized(HttpServletRequest req)
+   {
+      LOGGER.info("isAuthorized");
+
+      AuthPolicy auth = null;
+      try
+      {
+         auth = new AuthPolicy(req.getUserPrincipal());
+      }
+      catch(IllegalArgumentException ex)
+      {
+         throw new IllegalArgumentException("Authorization : UserPrincipal is not of expected type");
+      }
+      String[] pubdidArr = pubdidList.toArray(new String[pubdidList.size()]);
+      String[] authorized_pubdids;
+      authorized_pubdids = auth.filterAuthorized(pubdidArr, settings.dbConn.uri(), settings.dbConn.userName(), settings.dbConn.password());
+
+      /* If multiplicity allowed (and in mcutout/merge):
+       * if one or more of pubdids not-authorized -> all request not authorized
+       * */
+      /* NOTE for now soda/vlkb_cutout does not allow multiplicity --> only one pubdid allowed */
+      return (authorized_pubdids.length == pubdidArr.length);
+   }
+
+}
+
+
+
+
+
+@javax.servlet.annotation.MultipartConfig
+public class AuthZFilter implements Filter
+{
+   private static final Logger LOGGER = Logger.getLogger("AuthZFilter");
+
+
+   @Override
+   public void init(FilterConfig fc) throws ServletException {}
+
+   @Override
+   public void destroy() {}
+
+   @Override
+   public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain) throws IOException, ServletException
+   {
+      LOGGER.info("doFilter");
+
+      HttpServletRequest  req  = (HttpServletRequest)  request;
+      HttpServletResponse  resp = (HttpServletResponse)  response;
+
+      AuthZ authz = new AuthZ(req);
+
+      if(authz.isAuthorized(req))
+      {
+         chain.doFilter(request, response);
+      }
+      else
+      {
+         resp.setContentType("text/plain");
+         resp.sendError(HttpServletResponse.SC_FORBIDDEN, "Forbidden");
+      }
+   }
+
+}
+
diff --git a/data-access/servlet/src/main/java/webapi/AuthZSettings.java b/data-access/servlet/src/main/java/webapi/AuthZSettings.java
new file mode 100644
index 0000000000000000000000000000000000000000..7cd7e6a49ab33ed3f6f97415b30dfeafbac11ff5
--- /dev/null
+++ b/data-access/servlet/src/main/java/webapi/AuthZSettings.java
@@ -0,0 +1,83 @@
+
+import java.util.logging.Logger;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.util.Properties;
+import java.io.PrintWriter;
+
+
+class AuthZSettings
+{
+   private static final Logger LOGGER = Logger.getLogger("AuthZSettings");
+
+   public static class DBConn
+   {
+      private String uri;
+      private String schema;
+      private String user_name;
+      private String password;
+
+      public String toString()
+      {
+         return uri() + " [" + schema + "] " + user_name + " / " + password  + " ";
+      }
+
+      public String uri() { return uri; }
+      public String schema() { return schema; }
+      public String userName() { return user_name; }
+      public String password() { return password; }
+   }
+
+   public DBConn     dbConn;
+
+
+   // will not start without config-file
+   // no reasonable code-defaults can be invented
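+   //
+   // Example authpolicy.properties (illustrative values; the keys match
+   // loadDBConn() below):
+   //    db_uri       = jdbc:postgresql://localhost:5432/vialactea
+   //    db_schema    = datasets
+   //    db_user_name = vlkb
+   //    db_password  = secret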
+   public static AuthZSettings getInstance(String propertiesFilename)
+   {
+      try
+      {
+         InputStream ins =
+            AuthZSettings.class.getClassLoader().getResourceAsStream(propertiesFilename);
+
+         if (ins != null)
+         {
+            Properties properties = new Properties();
+            properties.load(ins);
+
+            DBConn    dbConn    = loadDBConn(properties);
+
+            return new AuthZSettings(dbConn);
+         }
+         else
+         {
+            throw new IllegalStateException(propertiesFilename + " not found in classpath");
+         }
+
+      }
+      catch(IOException ex)
+      {
+         throw new IllegalStateException("Error while loading " + propertiesFilename + " file", ex);
+      }
+   }
+
+
+
+   private AuthZSettings(DBConn dbConn)
+   {
+      this.dbConn    = dbConn;
+   }
+
+
+   private static DBConn loadDBConn(Properties properties)
+   {
+      DBConn dbconn = new AuthZSettings.DBConn();
+      dbconn.uri       = properties.getProperty("db_uri", "jdbc:postgresql://localhost:5432/vialactea").strip();
+      dbconn.schema    = properties.getProperty("db_schema", "datasets").strip();
+      dbconn.user_name = properties.getProperty("db_user_name", "").strip();
+      dbconn.password  = properties.getProperty("db_password", "").strip();
+      return dbconn;
+   }
+}
+
diff --git a/data-access/servlet/src/main/java/webapi/MonitorFilter.java b/data-access/servlet/src/main/java/webapi/MonitorFilter.java
new file mode 100644
index 0000000000000000000000000000000000000000..c20a1dab8f97a79765c4d4b11aefa9d57a903cdb
--- /dev/null
+++ b/data-access/servlet/src/main/java/webapi/MonitorFilter.java
@@ -0,0 +1,262 @@
+
+//import it.inaf.ia2.aa.data.User;
+
+import java.io.IOException;
+import java.io.BufferedReader;
+import java.io.InputStreamReader;
+import java.util.*; // ArrayList<String> Collection<>
+
+import java.util.logging.Logger;
+import javax.servlet.Filter;
+import javax.servlet.FilterChain;
+import javax.servlet.FilterConfig;
+import javax.servlet.ServletException;
+import javax.servlet.ServletRequest;
+import javax.servlet.ServletResponse;
+import javax.servlet.http.HttpServletRequest;
+import javax.servlet.http.HttpServletResponse;
+import javax.servlet.http.Part;
+
+
+import javax.servlet.http.HttpServletRequestWrapper;
+import java.security.Principal;
+
+@javax.servlet.annotation.MultipartConfig
+public class MonitorFilter implements Filter
+{
+  private static final Logger LOGGER = Logger.getLogger(MonitorFilter.class.getName());
+  private static final Settings settings = Settings.getInstance();
+
+
+   @Override
+   public void init(FilterConfig fc) throws ServletException {}
+
+   @Override
+   public void destroy() {}
+
+   @Override
+   public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain) throws IOException, ServletException
+   {
+      LOGGER.info("ENTER doFilter ====================================");
+
+      HttpServletRequest  req  = (HttpServletRequest)  request;
+      HttpServletResponse resp = (HttpServletResponse) response;
+
+      boolean readBody = false;                
+      //logServletEnv(req, resp, readBody);
+
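+      // Decide which request parameters carry the dataset IDs to authorize:
+      // a single ID for /vlkb_cutout, a semicolon-separated "pubdid" list for
+      // the merge endpoints, or IDs embedded in the JSON body of the multipart
+      // "mcutout" part for /uws_mcutout.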
+      String servletPath = req.getServletPath();
+
+      List<String> pubdidList = new ArrayList<String>();
+
+      if(servletPath.equals("/vlkb_cutout"))
+      {
+         String pubdid = req.getParameter("ID");
+         if(pubdid == null) pubdid = req.getParameter("pubdid");
+         if(pubdid != null) pubdidList.add(pubdid);
+         LOGGER.info("PARAM cut id: " + pubdid);
+      }
+      else if(servletPath.equals("/uws_merge") || servletPath.equals("/vlkb_merge"))
+      {
+         String pubdids = req.getParameter("pubdid");
+         LOGGER.info("PARAM mrg pubdid: " + pubdids);
+         if(pubdids != null) // guard against NullPointerException when the parameter is absent
+         {
+            String[] pubdidArr = pubdids.split(";");
+            for(String pubdid : pubdidArr)
+               if(pubdid.length() > 0) pubdidList.add(pubdid);
+         }
+      }
+      else if(servletPath.equals("/uws_mcutout"))
+      {
+         final String METHOD = req.getMethod();
+         if(METHOD.equals("POST"))
+         {
+            try
+            {
+               Part part = req.getPart("mcutout");
+               if(part == null)
+               {
+                  LOGGER.info("part 'mcutout' is null");
+               }
+               else
+               {
+                  String body = getValue(part);
+                  LOGGER.info("PARAM mct B[mcutout]: " + body);
+                  String[] pubdidArr = JdlMCutout.pubdidsFromReqJson(body);
+                  for(String pubdid : pubdidArr) 
+                     if(pubdid.length() > 0) pubdidList.add(pubdid);
+               }
+            }
+            catch(ServletException ex)
+            {
+               LOGGER.info("getPart(mcutout) ServletException: " + ex.getMessage());
+            }
+            catch(IOException ex)
+            {
+               LOGGER.info("getPart(mcutout) IOException: " + ex.getMessage());
+            }
+            catch(Exception ex)
+            {
+               LOGGER.info("getPart(mcutout) Exception: " + ex.getMessage());
+            }
+         }
+      }
+      else
+      {
+         LOGGER.info("ServletPath not used: " + servletPath);
+
+         LOGGER.info("CALL chain.doFilter");
+         chain.doFilter(request, response);
+         LOGGER.info("RETURN from chain.doFilter");
+
+         LOGGER.info("EXIT doFilter *************************************");
+         return;
+      }
+
+
+
+      int i = 0;
+      for(String pubdid : pubdidList)
+         LOGGER.info("pubdid[" + i++ + "]:" + pubdid);
+
+
+      LOGGER.info("AuthZ start --------------------------------------");
+      AuthPolicy auth = null;
+      try
+      {
+         auth = new AuthPolicy(req.getUserPrincipal());
+      }
+      catch(IllegalArgumentException ex)
+      {
+         throw new IllegalArgumentException("Authorization : UserPrincipal is not of expected type");
+      }
+      String[] pubdidArr = pubdidList.toArray(new String[pubdidList.size()]);
+      String[] authorized_pubdids;
+      LOGGER.info("Action cutout: for filterrAuthorized.");
+      authorized_pubdids = auth.filterAuthorized(pubdidArr, settings.dbConn.uri(), settings.dbConn.userName(), settings.dbConn.password());
+      LOGGER.info("AuthZ end ---------------------------------------");
+
+      i = 0;
+      for(String pubdid : authorized_pubdids)
+         LOGGER.info("authZpubdid[" + i++ + "]:" + pubdid);
+
+      if(servletPath.equals("/vlkb_cutout") && (authorized_pubdids.length < 1))
+      {
+         LOGGER.info("FORBIDDEN Authorization error");
+         resp.setContentType("text/plain");
+         resp.sendError(HttpServletResponse.SC_FORBIDDEN, "Forbidden");
+      }
+      else
+      {
+         LOGGER.info("CALL chain.doFilter");
+         chain.doFilter(request, response);
+         LOGGER.info("RETURN from chain.doFilter");
+      }
+
+      LOGGER.info("EXIT doFilter *************************************");
+
+   }
+
+
+
+
+   private static String getValue(Part part) throws IOException
+   {
+      BufferedReader reader = new BufferedReader(new InputStreamReader(part.getInputStream(), "UTF-8"));
+      StringBuilder value = new StringBuilder();
+      char[] buffer = new char[1024];
+      for (int length = 0; (length = reader.read(buffer)) > 0;)
+      {
+         value.append(buffer, 0, length);
+      }
+      return value.toString();
+   }
+
+
+
+
+
+   private void logServletEnv(HttpServletRequest  req, HttpServletResponse resp, boolean readBody)
+   {
+
+      Principal princ = req.getUserPrincipal();
+      LOGGER.info("getUserPrincipal available  : " + ((princ == null) ? "NO" : "YES"));
+      String H_AUTH = req.getHeader("Authorization");
+      if(H_AUTH == null )
+         LOGGER.info("getHeader[Authorization][-] : null");
+      else
+         LOGGER.info("getHeader[Authorization][" + H_AUTH.length() +"] :" + H_AUTH.substring(0,10) + " ..." );
+
+      /* Servlet.getServletConfig(), ServletConfig.getServletContext() */
+      /* ServeltContext.getContextPath() ServletContext.getRealPath() if war-name differs from ContextPath */
+      LOGGER.info("getContextPath    : " + req.getContextPath()); // usually the war-file name
+      LOGGER.info("getServletPath    : " + req.getServletPath());
+      LOGGER.info("getPathInfo       : " + req.getPathInfo()); // portion after context and before query-string
+      LOGGER.info("getPathTranslated : " + req.getPathTranslated()); // extra path translated to local path
+      LOGGER.info("getRequestURI     : " + req.getRequestURI());
+      LOGGER.info("getRequestURL     : " + req.getRequestURL().toString());
+      LOGGER.info("getQueryString    : " + req.getQueryString());
+      LOGGER.info("getContentType    : " + req.getContentType());
+
+
+      String METHOD = req.getMethod();
+      LOGGER.info("getMethod     : " + METHOD);
+
+
+      Map<String, String[]> map = req.getParameterMap();
+      for (Map.Entry<String, String[]> entry : map.entrySet())
+      {
+         String[] strArr = entry.getValue();
+         for(String str : strArr)
+            LOGGER.info("PARAM: " + entry.getKey() + ":" + str);
+      }
+
+
+      /* MultipartConfig.maxFileSize(), MultipartConfig.maxRequestSize() */
+      if(METHOD.equals("POST"))
+      {
+         try
+         {
+            /* for getParts() call to work in Tomcat the following config in context.xml is needed:
+             * <Context allowCasualMultipartParsing="true"> ...
+             */
+            Collection<Part> parts = req.getParts();
+            if(parts == null)
+            {
+               LOGGER.info("parts is null");
+            }
+            else
+            {
+               for(Part part : parts)
+               {
+                  LOGGER.info("part.getName              : " + part.getName());
+                  LOGGER.info("part.getSize              : " + part.getSize());
+                  LOGGER.info("part.getHeader            : " + part.getHeader(part.getName()));
+                  LOGGER.info("part.getContentType       : " + part.getContentType());
+                  LOGGER.info("part.getSubmittedFileName : " + part.getSubmittedFileName());
+
+                  if (readBody && part.getName().equals("mcutout"))
+                  {
+                     String body = getValue(part);
+                     LOGGER.info("BODY: " + body);
+                  }
+
+               }
+            }
+         }
+         catch(ServletException ex)
+         {
+            LOGGER.info("getParts ServletException: " + ex.getMessage());
+         }
+         catch(IOException ex)
+         {
+            LOGGER.info("getParts IOException: " + ex.getMessage());
+         }
+         catch(Exception ex)
+         {
+            LOGGER.info("getParts Exception: " + ex.getMessage());
+         }
+      }
+   }
+
+
+}
+
diff --git a/data-access/servlet/src/main/java/webapi/NeaLogEntry.java b/data-access/servlet/src/main/java/webapi/NeaLogEntry.java
new file mode 100644
index 0000000000000000000000000000000000000000..2ace3cc4f2ddbbe4e734c99e2c4833033c5fd470
--- /dev/null
+++ b/data-access/servlet/src/main/java/webapi/NeaLogEntry.java
@@ -0,0 +1,81 @@
+
+import org.json.simple.JSONObject;
+import java.text.SimpleDateFormat;
+import java.util.Date;
+import java.util.TimeZone;
+
+
+
+
+public class NeaLogEntry
+{
+   String serviceId = "vlkb";
+   String action;
+   String defaultLevel;
+
+
+   NeaLogEntry(String argAction, String argDefLevel)
+   {
+      action         = argAction;
+      defaultLevel   = argDefLevel;
+   }
+
+
+
+   String timeNowISO8601()
+   {
+      TimeZone tz = TimeZone.getTimeZone("UTC");
+      SimpleDateFormat df = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'");
+      df.setTimeZone(tz);
+      String timeAsISO8601 = df.format(new Date());
+      return timeAsISO8601;
+   }
+
+
+
+
+
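+   // Fields common to logging and accounting entries. Example emitted entry
+   // (illustrative values only):
+   //   {"timestamp":"2024-01-01T12:00:00.000Z","serviceid":"vlkb","level":"info",
+   //    "resource":"cutout","action":"doGet","userid":"anonymous"}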
+   JSONObject generateCommonInfo(String timestamp, String UserName, String resource)
+   {
+      JSONObject objCommon = new JSONObject();
+
+      objCommon.put("timestamp", timestamp);
+      objCommon.put("serviceid", serviceId);
+      objCommon.put("level", defaultLevel);
+
+      objCommon.put("resource", resource);
+      objCommon.put("action", action);
+      objCommon.put("userid", UserName);
+
+      return objCommon;
+   }
+
+
+
+   String generateLoggingInfo(String timestamp, String UserName,
+         String resource, String requestUrl, String clientIp)
+   {
+      JSONObject objLogging = generateCommonInfo(timestamp, UserName, resource);
+
+      objLogging.put("url", requestUrl);
+      objLogging.put("client_ip", clientIp);
+ 
+      return objLogging.toString();
+   }
+
+
+
+
+   String generateAccountingInfo(String timestamp, String UserName, String resource,
+         String measure, double value)
+   {
+      JSONObject objAccounting = generateCommonInfo(timestamp, UserName, resource);
+
+      objAccounting.put("value", value);
+      objAccounting.put("measure", measure);
+
+      return objAccounting.toString();
+   }
+
+}
+
diff --git a/data-access/servlet/src/main/java/webapi/ServletCutout.java b/data-access/servlet/src/main/java/webapi/ServletCutout.java
new file mode 100644
index 0000000000000000000000000000000000000000..26e673e427c374d654c2dbe7a5c55ec56d59c969
--- /dev/null
+++ b/data-access/servlet/src/main/java/webapi/ServletCutout.java
@@ -0,0 +1,482 @@
+
+import java.util.logging.Logger;
+
+import java.security.Principal;
+
+import javax.servlet.ServletException;
+import javax.servlet.http.HttpServletRequest;
+import javax.servlet.http.HttpServletResponse;
+import javax.servlet.ServletOutputStream;
+
+import java.io.OutputStreamWriter;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.io.PrintWriter;
+
+import java.io.UnsupportedEncodingException;
+import java.net.URLDecoder;
+
+/* for streaming the cutout-file */
+import java.io.File;
+import java.io.FileInputStream;
+import java.io.FileNotFoundException;
+
+import java.util.Arrays;
+import java.util.List;
+import java.util.LinkedList;
+import java.util.Map;
+import java.util.HashMap;
+import java.util.Properties;
+
+// for Logging/Accounting
+import org.json.simple.JSONObject;
+import java.text.SimpleDateFormat;
+import java.util.Date;
+import java.util.TimeZone;
+
+
+import java.nio.file.StandardOpenOption;
+import java.nio.file.Files;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+
+
+public class ServletCutout extends javax.servlet.http.HttpServlet
+{
+   protected static final Logger   LOGGER   = Logger.getLogger(ServletCutout.class.getName());
+   protected static final Settings settings = Settings.getInstance();
+
+   final String RESPONSE_ENCODING = "utf-8";
+   final String DEFAULT_RESPONSEFORMAT = settings.defaults.responseFormat;
+   final String DEFAULT_SKY_SYSTEM     = settings.defaults.skySystem;
+   final String DEFAULT_SPEC_SYSTEM    = settings.defaults.specSystem;
+   final String DEFAULT_TIME_SYSTEM = "MJD_UTC"; // FIXME take from config file
+
+   boolean showDuration = settings.defaults.showDuration;
+   long startTime_msec;
+
+   protected Cutout cutout = new CutoutImpl(settings);
+
+   private Subsurvey[] subsurveys = null;
+
+   public void init() throws ServletException
+   {
+      super.init();
+
+      LOGGER.info("AMQP : " + settings.amqpConn.toString());
+      LOGGER.info("FITS : " + settings.fitsPaths.toString());
+      String surveysAbsPathname = settings.fitsPaths.surveysMetadataAbsPathname();
+      if( (surveysAbsPathname != null) && (surveysAbsPathname.length() > 1) )
+         subsurveys = Subsurvey.loadSubsurveys(surveysAbsPathname);
+   }
+
+
+   protected void doSodaDescriptor(PrintWriter writer, String requestUrl)
+   {
+      String theDescriptor =
+         "<VOTABLE xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" xmlns=\"http://www.ivoa.net/xml/VOTable/v1.3\" version=\"1.3\">"
+         + "<RESOURCE type=\"meta\" utype=\"adhoc:service\" name=\"this\">"
+         + "<PARAM name=\"standardID\" datatype=\"char\" arraysize=\"*\" value=\"ivo://ivoa.net/std/SODA#sync-1.0\"/>"
+         + "<PARAM name=\"accessURL\" datatype=\"char\" arraysize=\"*\" value=\"" + requestUrl  + "\"/>"
+         + "<GROUP name=\"inputParams\">"
+         +  "<PARAM name=\"ID\" ucd=\"meta.id;meta.dataset\" datatype=\"char\" arraysize=\"*\" value=\"\"/>"
+         +  "<PARAM name=\"POS\" ucd=\"pos.outline;obs\" datatype=\"char\" arraysize=\"*\" value=\"\"/>"
+         +  "<PARAM name=\"CIRCLE\" ucd=\"phys.angArea;obs\" unit=\"deg\" datatype=\"double\" arraysize=\"3\" xtype=\"circle\" value=\"\"/>"
+         +  "<PARAM name=\"POLYGON\" unit=\"deg\" ucd=\"pos.outline;obs\" datatype=\"double\" arraysize=\"*\" xtype=\"polygon\"  value=\"\"/>"
+         +  "<PARAM name=\"BAND\" ucd=\"em.wl;stat.interval\" unit=\"m\" datatype=\"double\" arraysize=\"2\" xtype=\"interval\" value=\"\"/>"
+         +  "<PARAM name=\"TIME\" ucd=\"time.interval;obs.exposure\" unit=\"d\" datatype=\"double\" arraysize=\"2\" xtype=\"interval\" value=\"\"/>"
+         +  "<PARAM name=\"POL\" ucd=\"meta.code;phys.polarization\" datatype=\"char\" arraysize=\"*\" value=\"\"/>"
+         +  "<PARAM name=\"RESPONSEFORMAT\" ucd=\"meta.code.mime\" datatype=\"char\" arraysize=\"*\" value=\"application/fits\"/>"
+         + "</GROUP>"
+         + "</RESOURCE>"
+         + "</VOTABLE>";
+
+      writer.println(theDescriptor);
+   }
+
+
+   protected void doCutoutStream(String id, Pos pos, Band band, Time time, Pol pol,
+         OutputStream respOutputStream) throws IOException, InterruptedException
+   {
+      LOGGER.info("trace" + pos);
+
+      Resolver rsl = new ResolverFromId();
+      rsl.resolve(id);
+
+      if(pos  != null) pos.setSystem(Pos.System.valueOf(DEFAULT_SKY_SYSTEM));
+      if(band != null) band.setSystem(Band.System.valueOf(DEFAULT_SPEC_SYSTEM));
+      if(time != null) time.setSystem(Time.System.valueOf(DEFAULT_TIME_SYSTEM));
+
+      cutout.doStream(rsl.relPathname(), rsl.hdunum(), pos, band, time, pol, respOutputStream);
+   }
+
+
+   protected void doCutoutFileStream(String id, Pos pos, Band band, Time time, Pol pol, OutputStream respOutputStream)
+      throws IOException
+   {
+      LOGGER.info("trace" + pos);
+
+      Resolver rsl = new ResolverFromId();
+      rsl.resolve(id);
+
+
+
+      if(pos  != null) pos.setSystem(Pos.System.valueOf(DEFAULT_SKY_SYSTEM));
+      if(band != null) band.setSystem(Band.System.valueOf(DEFAULT_SPEC_SYSTEM));
+      if(time != null) time.setSystem(Time.System.valueOf(DEFAULT_TIME_SYSTEM));
+
+      CutResult cutResult = cutout.doFile(rsl.relPathname(), rsl.hdunum(), pos, band, time, pol, false, null);
+
+      Path path = Paths.get(cutResult.filename);
+      InputStream inStream = Files.newInputStream(path, StandardOpenOption.DELETE_ON_CLOSE);
+      inStream.transferTo(respOutputStream);
+      inStream.close();
+   }
+
+
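+   // Resolves the ID to a FITS file and HDU, runs the cutout into a file,
+   // and wraps the result in a DataLink entry. When no database URI is
+   // configured the ID itself encodes the path (ResolverFromId); otherwise
+   // the ObsCore table is consulted (ResolverByObsCore).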
+   protected DataLink doCutoutFile(String id, Pos pos, Band band, Time time, Pol pol,
+         boolean countNullValues, String respFormat)
+   {
+      LOGGER.info("trace");
+
+      String relPathname;
+      int hdunum;
+
+      String dbUri = settings.dbConn.uri();
+
+      if((dbUri == null) || dbUri.trim().isEmpty())
+      {
+         Resolver rsl = new ResolverFromId();
+         rsl.resolve(id);
+         relPathname = rsl.relPathname();
+         hdunum      = rsl.hdunum();
+      }
+      else
+      {
+         ResolverByObsCore rsl = new ResolverByObsCore(settings.dbConn, subsurveys);
+         rsl.resolve(id);
+         relPathname = rsl.relPathname();
+         hdunum      = rsl.hdunum();
+         String subsurveyId = rsl.obsCollection(); // this implementation assumes ObsCore::obs_collection holds the subsurvey ID
+         FitsCard[] extraCards = null;
+         if(subsurveyId != null)
+         {
+            extraCards = Subsurvey.subsurveysFindCards(subsurveys, subsurveyId);
+         }
+         else
+         {
+            LOGGER.info("Resolver with Obscore returns subsurveyId null: no extraCards loaded.");
+         }
+         // FIXME use of extraCards not implemented
+      }
+
+      final String DEFAULT_TIME_SYSTEM = "MJD_UTC"; // FIXME take from config file
+
+      if(pos  != null) pos.setSystem(Pos.System.valueOf(DEFAULT_SKY_SYSTEM));
+      if(band != null) band.setSystem(Band.System.valueOf(DEFAULT_SPEC_SYSTEM));
+      if(time != null) time.setSystem(Time.System.valueOf(DEFAULT_TIME_SYSTEM));
+
+      CutResult cutResult = cutout.doFile(relPathname, hdunum, pos, band, time, pol, false, null);
+
+      DataLink dlk = new DataLink();
+
+      dlk.id            = id;
+      dlk.accessUrl     = dlk.convertLocalPathnameToRemoteUrl(cutResult.filename,
+            settings.fitsPaths.cutouts(), settings.fitsPaths.cutoutsUrl());
+      dlk.serviceDef    = null;
+      dlk.errorMessage  = null;
+      dlk.description   = "A cutout from " + id;// + " by parameters "
+                                                // + pos.toString() + " " + band.toString() + " " + time.toString() + " " + pol.toString();
+      dlk.semantics     = "http://www.ivoa.net/rdf/datalink/core#proc#cutout";
+      dlk.contentType   = "application/fits";
+      dlk.contentLength = cutResult.filesize;
+
+      // VLKB-extension to DataLink:
+      Coord coord = new Coord(DEFAULT_SKY_SYSTEM, pos, DEFAULT_SPEC_SYSTEM, band, time, pol);
+      LOGGER.info(coord.toString());
+
+      dlk.inputs         = new Inputs(id, coord, countNullValues);
+      dlk.versionString  = Version.asString;
+      dlk.cut            = null;
+      dlk.absCutPathname = cutResult.filename;
+      dlk.datacubeCount  = 1;
+      dlk.nullVals       = ((cutResult.nullValueCount.percent < 0) || (cutResult.nullValueCount.totalCount < 1)) ?
+         null : cutResult.nullValueCount;
+      dlk.mcutResultArr  = null;
+
+      return dlk;
+   }
+
+
+   protected void doMultiValuedParamNotSupported(String message, PrintWriter printWriter)
+   {
+      printWriter.println("MultiValuedParamNotSupported : " + message);
+   }
+
+   protected void doUsageError(String message, PrintWriter printWriter)
+   {
+      printWriter.println("UsageError : " + message);
+   }
+
+   protected void doError(String message, PrintWriter printWriter)
+   {
+      printWriter.println("Error : " + message);
+   }
+
+
+
+   /* HTTP/J2EE -> SODA */
+
+
+   /* DALI allows GET and POST for sync services */
+
+   protected void doGet(HttpServletRequest request, HttpServletResponse response)
+         throws ServletException, IOException, UnsupportedEncodingException
+      {
+         startTime_msec = System.currentTimeMillis();
+
+         final boolean NO_QUERY_STRING = (request.getQueryString() == null);
+
+         if(NO_QUERY_STRING)
+         {
+            writeSodaDescriptor(request, response);
+            LOGGER.info("normal exit with SODA service descriptor");
+            return;
+         }
+         else
+         {
+            convertHttpToSoda(request, response);
+            LOGGER.info("normal exit");
+         }
+      }
+
+   protected void doPost(HttpServletRequest request, HttpServletResponse response)
+         throws ServletException, IOException, UnsupportedEncodingException
+      {
+         startTime_msec = System.currentTimeMillis();
+
+         final boolean NO_QUERY_STRING = (request.getQueryString() == null);
+
+         if(NO_QUERY_STRING)
+         {
+            writeSodaDescriptor(request, response);
+            LOGGER.info("normal exit with SODA service descriptor");
+            return;
+         }
+         else
+         {
+            convertHttpToSoda(request, response);
+            LOGGER.info("normal exit");
+         }
+      }
+
+
+
+   protected void writeSodaDescriptor(HttpServletRequest request, HttpServletResponse response)
+         throws ServletException, IOException, UnsupportedEncodingException
+      {
+         PrintWriter writer = new PrintWriter(new OutputStreamWriter(response.getOutputStream(), RESPONSE_ENCODING));
+         response.setContentType("text/xml");
+         doSodaDescriptor(writer, request.getRequestURL().toString());
+         writer.close();
+      }
+
+
+   private Map<SodaParam, String[]> collectSodaParams(HttpServletRequest req)
+   {
+      Map<SodaParam, String[]> params = new HashMap<SodaParam, String[]>();
+      for(SodaParam paramToken : SodaParam.values())
+      {
+         String[] paramValue = req.getParameterValues(paramToken.toString());
+         params.put(paramToken, paramValue);
+      }
+      return params;
+   }
+
+
+
+   protected void convertHttpToSoda(HttpServletRequest request, HttpServletResponse response) 
+         throws ServletException, IOException, UnsupportedEncodingException
+
+      {
+         ServletOutputStream  respOutputStream = response.getOutputStream();
+
+         try
+         {
+            Map<SodaParam, String[]> params = collectSodaParams(request);
+            SodaParser parser = new SodaParser(params);
+
+            String id   = null;
+            Pos    pos  = null;
+            Band   band = null;
+            Time   time = null;
+            Pol    pol  = null;
+
+            if(parser.sodaReq_hasSodaId())
+            {
+               id   = parser.sodaReq_getId();
+               pos  = parser.sodaReq_getPosCirclePolygon();
+               band = parser.sodaReq_getBand();
+               time = parser.sodaReq_getTime();
+               pol  = parser.sodaReq_getPol();
+            }
+            else
+            {
+               id   = parser.vlkbReq_getPubdid();
+               pos  = parser.vlkbReq_getCircleRect();
+               band = parser.vlkbReq_getVelocity();
+            }
+
+            String respFormat = sodaReq_getResponseFormat(request, DEFAULT_RESPONSEFORMAT);
+
+            LOGGER.info("responseFormat: " + respFormat);
+
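+            // Dispatch on the negotiated response format: cut to a temporary
+            // file and stream it ("application/fits;createfile=yes"), stream
+            // the cutout directly ("application/fits"), or produce the legacy
+            // XML/DataLink description ("application/x-vlkb+xml").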
+            if(respFormat.equals("application/fits;createfile=yes"))
+            {
+               response.setContentType(respFormat);
+               doCutoutFileStream(id, pos, band, time, pol, respOutputStream);
+            }
+            else if(respFormat.equals("application/fits"))
+            {
+               response.setContentType(respFormat);
+               doCutoutStream(id, pos, band, time, pol, respOutputStream);
+            }
+            else if(respFormat.equals("application/x-vlkb+xml"))
+            {
+               boolean  countNullValues = vlkbReq_getNullValues(request);
+               response.setContentType(respFormat);
+
+               DataLink respDataLink = doCutoutFile(id, pos, band, time, pol, countNullValues, respFormat);
+
+               /* FIXME errors from engine not checked - cut-file might not have been created */
+               LOGGER.info("DataLink - id:" + respDataLink.id + " url: " + respDataLink.accessUrl );
+
+               final String respEncoding = "utf-8";
+               PrintWriter writer = new PrintWriter(new OutputStreamWriter(respOutputStream, RESPONSE_ENCODING));
+               XmlSerializer.serializeToLegacyCutResults(writer, respEncoding, respDataLink, showDuration, startTime_msec);
+               writer.close(); /* NOTE must close to force flush to complete the xml */
+            }
+            else
+            {
+               throw new IllegalArgumentException("Unsupported RESPONSEFORMAT value : " + respFormat);
+            }
+
+         }
+         catch(MultiValuedParamNotSupported ex)
+         {
+            LOGGER.info("MultiValuedParamNotSupported: " + ex.getMessage());
+
+            response.setStatus(HttpServletResponse.SC_BAD_REQUEST);
+            response.setContentType("text/plain");
+            PrintWriter writer = new PrintWriter(new OutputStreamWriter(respOutputStream, RESPONSE_ENCODING));
+
+            doMultiValuedParamNotSupported(ex.getMessage(), writer);
+
+            writer.close();
+         }
+         catch(IllegalArgumentException ex)
+         {
+            LOGGER.info("IllegalArgumentException: " + ex.getMessage());
+
+            response.setStatus(HttpServletResponse.SC_BAD_REQUEST);
+            response.setContentType("text/plain");
+            PrintWriter writer = new PrintWriter(new OutputStreamWriter(respOutputStream, RESPONSE_ENCODING));
+
+            doUsageError(ex.getMessage(), writer);
+
+            writer.close();
+         }
+         catch(Exception ex)
+         {
+            LOGGER.info("Exception: " + ex.getMessage());
+            ex.printStackTrace();
+
+            response.setStatus(HttpServletResponse.SC_INTERNAL_SERVER_ERROR);
+            response.setContentType("text/plain");
+            PrintWriter writer = new PrintWriter(new OutputStreamWriter(respOutputStream, RESPONSE_ENCODING));
+
+            doError(ex.toString(), writer);
+
+            writer.close();
+         }
+         finally
+         {
+            respOutputStream.close();
+         }
+
+      }
+
+
+
+
+
+
+   /* SODA */
+
+
+   /* return null if the value is not present, or the value if present exactly once;
+    * else throw a MultiValuedParamNotSupported SODA error
+    */
+   private String soda_getSingleValue(HttpServletRequest req, String name)
+   {
+      String[] valArr = req.getParameterValues(name);
+
+      if(valArr == null)
+         return null;
+      else
+         if(valArr.length == 0)
+            return null;
+         else if(valArr.length == 1)
+            return valArr[0];
+         else
+            throw new IllegalArgumentException(
+                  "MultiValuedParamNotSupported: " + name + " was found " + valArr.length + " times");
+   }
+
+
+   private String sodaReq_getResponseFormat(HttpServletRequest req, String defaultResponseFormat)
+   {
+      String respFormat = soda_getSingleValue(req, "RESPONSEFORMAT");
+      return ((respFormat == null) ? defaultResponseFormat : respFormat);
+   }
+
+
+   private boolean vlkbReq_getNullValues(HttpServletRequest request)
+   {
+      return (null != soda_getSingleValue(request, "nullvals"));
+   }
+
+}
+
+
+
+/* from SODA (upon error):
+   Error codes are specified in DALI. Error documents should be text
+   using the text/plain content-type and the text must begin with one of the
+   following strings:
+
+   Error Code : Description
+   ---------------------------------------
+Error: General error (not covered below)
+
+AuthenticationError: Not authenticated
+AuthorizationError: Not authorized to access the resource
+
+ServiceUnavailable: Transient error (could succeed with retry)
+UsageError: Permanent error (retry pointless)
+
+MultiValuedParamNotSupported: request included multiple values for a parameter
+but the service only supports a single value
+*/
+
+
+/* from DALI (upon successful request):
+   The service should set HTTP headers (Fielding and Gettys et al.,
+   1999) that are useful to the correct values where possible. Recommended
+   headers to set when possible:
+   Content-Type
+   Content-Encoding
+   Content-Length  -- not in SODA-stream: impossible to know
+   Last-Modified   -- not in SODA-stream: impossible to know
+   */
diff --git a/data-access/servlet/src/main/java/webapi/ServletMCutout.java b/data-access/servlet/src/main/java/webapi/ServletMCutout.java
new file mode 100644
index 0000000000000000000000000000000000000000..fd88d8da960a85b8789dcc6824a79527dbbb234a
--- /dev/null
+++ b/data-access/servlet/src/main/java/webapi/ServletMCutout.java
@@ -0,0 +1,136 @@
+
+import java.util.logging.Logger;
+
+import java.security.Principal;
+
+import javax.servlet.ServletException;
+import javax.servlet.http.HttpServletRequest;
+import javax.servlet.http.HttpServletResponse;
+
+import java.io.IOException;
+import java.io.OutputStream;
+import java.io.OutputStreamWriter;
+import java.io.InputStream;
+import java.io.InputStreamReader;
+import java.io.BufferedReader;
+import java.io.PrintWriter;
+
+import java.util.Arrays;
+import java.util.List;
+import java.util.LinkedList;
+import java.util.Map;
+import java.util.Properties;
+
+// for Logging/Accounting
+import org.json.simple.JSONObject;
+import java.text.SimpleDateFormat;
+import java.util.Date;
+import java.util.TimeZone;
+
+// read HTTP body
+import java.util.stream.Collectors;
+import org.apache.commons.io.IOUtils;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
+import com.fasterxml.jackson.annotation.JsonProperty;
+
+
+public class ServletMCutout extends javax.servlet.http.HttpServlet
+{
+   private static final Logger   LOGGER   = Logger.getLogger(ServletMCutout.class.getName());
+   private static final Settings settings = Settings.getInstance();
+
+   protected Datasets datasets = new DatasetsImpl(settings);
+
+
+
+   public void init() throws ServletException
+   {
+      super.init();
+
+      LOGGER.info("FITS : " + settings.fitsPaths.toString());
+      LOGGER.info("AMQP : " + settings.amqpConn.toString());
+      LOGGER.info("DB   : " + settings.dbConn.toString());
+
+   }
+
+
+   /* DALI allows GET and POST for sync services */
+
+   protected void doGet(HttpServletRequest request, HttpServletResponse response)
+       throws ServletException, IOException
+   {
+      processRequest(request, response);
+   }
+
+   protected void doPost(HttpServletRequest request, HttpServletResponse response)
+      throws ServletException, IOException
+   {
+      processRequest(request, response);
+   }
+
+
+
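+   // Reads the whole request body as a JSON string describing the requested
+   // cutouts, delegates to Datasets.doMCutout, and serializes the resulting
+   // DataLink as the legacy XML response.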
+   protected void processRequest(HttpServletRequest request, HttpServletResponse response)
+      throws ServletException, IOException
+   {
+      long startTime_msec = System.currentTimeMillis();
+      boolean showDuration = true;
+
+      InputStreamReader isr = null;
+
+      response.setContentType("text/xml");
+      try
+      {
+         isr    = new InputStreamReader(request.getInputStream(),"utf-8");
+
+
+         BufferedReader input = new BufferedReader(isr);
+         StringBuffer jsonStringBuffer = new StringBuffer();
+         String line;
+         while((line = input.readLine()) != null)
+         {
+            jsonStringBuffer.append(line);
+         }
+         String reqJsonString = jsonStringBuffer.toString();
+
+         OutputStream respOutputStream = response.getOutputStream();
+
+         DataLink dlk = datasets.doMCutout(reqJsonString);
+
+         final String contentType = "text/xml"; // FIXME
+         final String respEncoding = "utf-8"; // FIXME
+         if(contentType.equals("text/xml") || contentType.equals("application/xml"))
+         {
+            //LOGGER.info("writing xml");
+            PrintWriter writer = new PrintWriter(new OutputStreamWriter(respOutputStream, respEncoding));
+            XmlSerializer.serializeToLegacyCutResults(writer, respEncoding, dlk, showDuration, startTime_msec);
+            writer.close();
+         }
+         else if(contentType.equals("application/tar.gz")) // FIXME mime for tgz ?
+         {
+            //LOGGER.info("streaming the file NOT IMPLEMENTED yet for mcutout");
+            /*  
+                File downloadFile = new File(cod.absCutPathname);
+                InputStream input = new BufferedInputStream(new FileInputStream(downloadFile));
+                input.transferTo(respOutputStream);
+                LOGGER.info("Deleting after download: " + downloadFile.getName() );
+                downloadFile.delete();
+                */
+         }
+         else
+         {
+            throw new AssertionError("Unsupported contentType for output: " + contentType);
+         }
+
+
+      }
+      catch(IOException ex)
+      {
+         /* FIXME find better exception */
+         throw new AssertionError("internal error: reading the request body threw IOException: " + ex.getMessage());
+      }
+   }
+
+}
+
diff --git a/data-access/servlet/src/main/java/webapi/ServletMerge.java b/data-access/servlet/src/main/java/webapi/ServletMerge.java
new file mode 100644
index 0000000000000000000000000000000000000000..2415e543489c224949535e75f80b527082857195
--- /dev/null
+++ b/data-access/servlet/src/main/java/webapi/ServletMerge.java
@@ -0,0 +1,210 @@
+
+import java.util.logging.Logger;
+
+import java.io.IOException;
+import java.io.PrintWriter;
+import java.io.File;
+import java.io.OutputStream;
+import java.util.Enumeration;
+import java.util.*; // ArrayList<String>
+
+import javax.servlet.ServletConfig;
+import javax.servlet.ServletException;
+import javax.servlet.http.HttpServletRequest;
+import javax.servlet.http.HttpServletResponse;
+import javax.servlet.ServletOutputStream; // for SODA
+
+// from vlkb_mergefiles.java - dir & file handling
+import java.io.*;
+
+import java.nio.file.*;
+import static java.nio.file.StandardCopyOption.*;
+
+//import nom.tam.fits.*;// Fits - for regridding -- merge only
+
+// config file
+import java.util.Properties;
+
+
+// for Logging/Accounting
+import org.json.simple.JSONObject;
+import java.text.SimpleDateFormat;
+import java.util.Date;
+import java.util.TimeZone;
+
+import java.security.Principal;
+
+
+
+public class ServletMerge extends javax.servlet.http.HttpServlet
+{
+   private static final Logger   LOGGER   = Logger.getLogger(ServletMerge.class.getName());
+   private static final Settings settings = Settings.getInstance();
+
+   final String RESPONSE_ENCODING = "utf-8";
+   final String DEFAULT_RESPONSEFORMAT = settings.defaults.responseFormat;
+   final String DEFAULT_SKY_SYSTEM     = settings.defaults.skySystem;
+   final String DEFAULT_SPEC_SYSTEM    = settings.defaults.specSystem;
+
+
+   Datasets datasets = new DatasetsImpl(settings);
+
+
+
+   public void init() throws ServletException
+   {
+      super.init();
+
+      LOGGER.info("FITS : " + settings.fitsPaths.toString());
+      LOGGER.info("AMQP : " + settings.amqpConn.toString());
+      LOGGER.info("DB   : " + settings.dbConn.toString());
+   }
+
+
+   /* DALI allows GET and POST for sync services */
+
+   protected void doGet(HttpServletRequest request, HttpServletResponse response)
+         throws ServletException, IOException
+      {
+         processRequest(request, response);
+      }
+
+   protected void doPost(HttpServletRequest request, HttpServletResponse response)
+         throws ServletException, IOException
+      {
+         processRequest(request, response);
+      }
+
+
+
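+   // Parses SODA (or legacy vlkb) parameters, merges the requested datasets,
+   // and returns either the legacy XML description or the merged FITS file,
+   // which is deleted after streaming.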
+   protected void processRequest(HttpServletRequest request, HttpServletResponse response)
+         throws ServletException, IOException
+      {
+         long startTime_msec = System.currentTimeMillis();
+         boolean showDuration = true;
+
+         ServletOutputStream  respOutputStream = response.getOutputStream();
+
+         try
+         {
+
+            Map<SodaParam, String[]> params = collectSodaParams(request);
+            SodaParser parser = new SodaParser(params);
+
+            String id   = null;
+            Pos    pos  = null;
+            Band   band = null;
+            Time   time = null;
+            Pol    pol  = null;
+
+            if(parser.sodaReq_hasSodaId())
+            {
+               id   = parser.sodaReq_getId();
+               pos  = parser.sodaReq_getPosCirclePolygon();
+               band = parser.sodaReq_getBand();
+               time = parser.sodaReq_getTime();
+               pol  = parser.sodaReq_getPol();
+            }
+            else
+            {
+               id   = parser.vlkbReq_getPubdid();
+               pos  = parser.vlkbReq_getCircleRect();
+               band = parser.vlkbReq_getVelocity();
+            }
+
+            String respFormat = DEFAULT_RESPONSEFORMAT;//sodaReq_getResponseFormat(request, DEFAULT_RESPONSEFORMAT);
+
+            Coord coord = new Coord(DEFAULT_SKY_SYSTEM, pos, DEFAULT_SPEC_SYSTEM, band, time, pol);
+
+            // FIXME should parse from params
+            boolean countNullValues = false;
+            String respContentType = DEFAULT_RESPONSEFORMAT;
+
+            response.setContentType(respContentType);
+
+            CutResult cutResult = datasets.doMerge(parseLegacyPubdidArr(id), coord, countNullValues);
+
+            DataLink dlk = new DataLink(cutResult);
+
+            String contentType = respContentType;
+            String respEncoding = RESPONSE_ENCODING;
+
+            if(contentType.equals("text/xml") || contentType.equals("application/xml"))
+            {
+               LOGGER.info("writing xml");
+               PrintWriter writer = new PrintWriter(new OutputStreamWriter(respOutputStream, respEncoding));
+               XmlSerializer.serializeToLegacyCutResults(writer, respEncoding, dlk, showDuration, startTime_msec);
+               writer.close();
+            }
+            else if(contentType.equals("application/fits"))
+            {
+               LOGGER.info("streaming the file");
+
+               File downloadFile = new File(dlk.absCutPathname);
+               FileInputStream input = new FileInputStream(downloadFile);
+
+               input.transferTo(respOutputStream);
+
+               LOGGER.info("Deleting after download: " + downloadFile.getName() );
+               downloadFile.delete();
+            }
+            else
+            {
+               throw new AssertionError("Unsupported contentType for output: " + contentType);
+            }
+
+            respOutputStream.close();
+         }
+         catch(IllegalArgumentException ex)
+         {
+            LOGGER.info("Illegal arg : " + ex.getMessage());
+
+            response.setStatus(HttpServletResponse.SC_BAD_REQUEST);
+            response.setContentType("text/plain");
+            PrintWriter writer = new PrintWriter(new OutputStreamWriter(respOutputStream, RESPONSE_ENCODING));
+            writer.println("UsageError : " + ex.getMessage());
+            writer.close();
+         }
+         catch(Exception ex)
+         {
+            ex.printStackTrace();
+
+            response.setStatus(HttpServletResponse.SC_INTERNAL_SERVER_ERROR);
+            response.setContentType("text/plain");
+            PrintWriter writer = new PrintWriter(new OutputStreamWriter(respOutputStream, RESPONSE_ENCODING));
+            writer.println("Error : " + ex.getMessage());
+            writer.close();
+         }
+
+         LOGGER.info("processRequest normal exit");
+      }
+
+
+   private Map<SodaParam, String[]> collectSodaParams(HttpServletRequest req)
+   {
+      Map<SodaParam, String[]> params = new HashMap<SodaParam, String[]>();
+      for(SodaParam paramToken : SodaParam.values())
+      {
+         String[] paramValue = req.getParameterValues(paramToken.toString());
+         params.put(paramToken, paramValue);
+      }
+      return params;
+   }
+
+
+   /* convert a semicolon-separated list of pubdids to an array */
+   private String[] parseLegacyPubdidArr(String pubdids)
+   {   
+      List<String> pubdidList = new ArrayList<String>();
+      String[] pdArr = pubdids.split(";");
+      for(String pd : pdArr)
+         if(pd.length() > 0) pubdidList.add(pd);
+
+      String[] pubdidArr = new String[pubdidList.size()];
+
+      return pubdidList.toArray(pubdidArr);
+   }   
+
+
+}
+
diff --git a/data-access/servlet/src/main/java/webapi/Settings.java b/data-access/servlet/src/main/java/webapi/Settings.java
new file mode 100644
index 0000000000000000000000000000000000000000..233bfe3882e3580257f9c2c5d433a9302d6514c0
--- /dev/null
+++ b/data-access/servlet/src/main/java/webapi/Settings.java
@@ -0,0 +1,173 @@
+
+import java.util.logging.Logger;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.util.Properties;
+import java.io.PrintWriter;
+
+
+class Settings
+{
+   private static final Logger LOGGER = Logger.getLogger("Settings");
+
+   static final String CUTOUT_PROPERTIES = "cutout.properties";
+
+
+   public static class FITSPaths
+   {
+      private String FITSpath;
+      private String FITScutpath;
+      private String FITSRemoteUrlCutouts;
+      private String surveysMetadataAbsPathname;
+
+      public String surveys() {return FITSpath;}
+      public String cutouts() {return FITScutpath;}
+      public String cutoutsUrl() {return FITSRemoteUrlCutouts;}
+      public String surveysMetadataAbsPathname() {return surveysMetadataAbsPathname;}
+
+      public String toString()
+      {
+         return FITSpath + " " + FITScutpath + " " + FITSRemoteUrlCutouts + " " + surveysMetadataAbsPathname;
+      }
+   }
+
+
+   public static class DBConn
+   {
+      private String uri;
+      private String schema;
+      private String user_name;
+      private String password;
+
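+      // NOTE: toString() includes the password in clear text; take care when logging.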
+      public String toString()
+      {
+         return uri() + " [" + schema + "] " + user_name + " / " + password  + " ";
+      }
+
+      public String uri() { return uri; }
+      public String schema() { return schema; }
+      public String userName() { return user_name; }
+      public String password() { return password; }
+   }
+
+
+   public static class DefaultParamValues
+   {
+      String responseFormat;
+      String skySystem;
+      String specSystem;
+      boolean showDuration;
+   }
+
+
+   public static class AMQPConn
+   {
+      private String hostName;
+      private int    portNum;
+      private String routingKey;
+
+      public String hostName()   { return hostName; }
+      public int    portNumber() { return portNum; }
+      public String routingKey() { return routingKey; }
+
+      public String toString()
+      {
+         return hostName + " " + String.valueOf(portNum) + " " + routingKey;
+      }
+   }
+
+   public FITSPaths  fitsPaths;
+   public DBConn     dbConn;
+   public AMQPConn   amqpConn;
+   public DefaultParamValues defaults;
+
+
+   // will not start without config-file
+   // no reasonable code-defaults can be invented
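+   //
+   // Example cutout.properties (illustrative values; the keys match the
+   // getProperty() calls below):
+   //    fits_path_surveys = /srv/surveys
+   //    fits_path_cutouts = /srv/cutouts
+   //    fits_url_cutouts  = http://example.org/cutouts
+   //    surveys_metadata_abs_pathname = /srv/surveys/survey_populate.csv
+   //    db_uri            = jdbc:postgresql://localhost:5432/vialactea
+   //    amqp_host_name    = localhost
+   //    amqp_port         = 5672
+   //    default_response_format = application/fits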
+   public static Settings getInstance()
+   {
+      try
+      {
+         InputStream ins =
+            Settings.class.getClassLoader().getResourceAsStream(CUTOUT_PROPERTIES);
+
+         if (ins != null)
+         {
+            Properties properties = new Properties();
+            properties.load(ins);
+
+            FITSPaths fitsPaths = loadFITSPaths(properties);
+            DBConn    dbConn    = loadDBConn(properties);
+            AMQPConn  amqpConn  = loadAMQPConn(properties);
+            DefaultParamValues defaults = loadDefaults(properties);
+
+            return new Settings(dbConn, amqpConn, fitsPaths, defaults);
+         }
+         else
+         {
+            throw new IllegalStateException(CUTOUT_PROPERTIES + " not found in classpath");
+         }
+
+      }
+      catch(IOException ex)
+      {
+         throw new IllegalStateException("Error while loading " + CUTOUT_PROPERTIES + " file", ex);
+      }
+   }
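+
+   /* Sketch of the cutout.properties keys this loader understands (key names
+    * taken from the load* methods below; the values shown are only examples):
+    *
+    *   fits_path_surveys=/srv/surveys
+    *   fits_path_cutouts=/srv/cutouts
+    *   surveys_metadata_abs_pathname=/srv/surveys/survey_populate.csv
+    *   db_uri=   db_schema=   db_user_name=   db_password=
+    *   amqp_host_name=localhost   amqp_port=5672   amqp_routing_key=
+    *   default_response_format=application/fits
+    *   default_sky_system=ICRS   default_spec_system=WAVE_Barycentric
+    *   show_duration=no
+    */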
+
+
+
+   private Settings(DBConn dbConn, AMQPConn amqpConn,
+         FITSPaths fitsPaths, DefaultParamValues defaults)
+   {
+      this.fitsPaths = fitsPaths;
+      this.dbConn    = dbConn;
+      this.amqpConn  = amqpConn;
+      this.defaults  = defaults;
+   }
+
+
+
+   private static FITSPaths loadFITSPaths(Properties properties)
+   {
+      FITSPaths fitspaths = new FITSPaths();
+      fitspaths.FITSpath             = properties.getProperty("fits_path_surveys", "/srv/surveys").strip();
+      fitspaths.FITScutpath          = properties.getProperty("fits_path_cutouts", "/srv/cutouts").strip();
+      fitspaths.FITSRemoteUrlCutouts = properties.getProperty("fits_url_cutouts",  "").strip();
+      fitspaths.surveysMetadataAbsPathname = properties.getProperty("surveys_metadata_abs_pathname", "/srv/surveys/survey_populate.csv").strip();
+      return fitspaths;
+   }
+
+   private static DBConn loadDBConn(Properties properties)
+   {
+      DBConn dbconn = new Settings.DBConn();
+      dbconn.uri       = properties.getProperty("db_uri", "").strip();
+      dbconn.schema    = properties.getProperty("db_schema", "").strip();
+      dbconn.user_name = properties.getProperty("db_user_name", "").strip();
+      dbconn.password  = properties.getProperty("db_password", "").strip();
+      return dbconn;
+   }
+
+   private static DefaultParamValues loadDefaults(Properties properties)
+   {
+      DefaultParamValues defaults = new DefaultParamValues();
+      defaults.responseFormat = properties.getProperty("default_response_format", "application/fits").strip();
+      defaults.skySystem      = properties.getProperty("default_sky_system", "ICRS").strip();
+      defaults.specSystem     = properties.getProperty("default_spec_system", "WAVE_Barycentric").strip();
+      defaults.showDuration   = "yes".equals(properties.getProperty("show_duration", "no").strip());
+      return defaults;
+   }
+
+   private static AMQPConn loadAMQPConn(Properties properties)
+   {
+      AMQPConn amqpconn = new AMQPConn();
+      amqpconn.hostName   = properties.getProperty("amqp_host_name", "localhost").strip();
+      String strPortNum   = properties.getProperty("amqp_port", "5672").strip();
+      amqpconn.portNum    = Integer.parseInt(strPortNum);
+      amqpconn.routingKey = properties.getProperty("amqp_routing_key", "").strip();
+      return amqpconn;
+   }
+
+}
+
diff --git a/data-access/servlet/src/main/java/webapi/UWSMCutout.java b/data-access/servlet/src/main/java/webapi/UWSMCutout.java
new file mode 100644
index 0000000000000000000000000000000000000000..b4565c0a8a823fdec317bf80239ebfca0601ce7a
--- /dev/null
+++ b/data-access/servlet/src/main/java/webapi/UWSMCutout.java
@@ -0,0 +1,67 @@
+import java.io.IOException;
+import java.io.PrintWriter;
+
+import javax.servlet.ServletException;
+import javax.servlet.http.HttpServletRequest;
+import javax.servlet.http.HttpServletResponse;
+
+import uws.UWSException;
+import uws.job.ErrorType;
+import uws.job.JobList;
+import uws.job.JobThread;
+import uws.job.UWSJob;
+import uws.job.parameters.InputParamController;
+import uws.job.parameters.NumericParamController;
+import uws.job.parameters.StringParamController;
+import uws.job.user.JobOwner;
+import uws.service.UWSServlet;
+import uws.service.UWSUrl;
+
+public class UWSMCutout extends UWSServlet {
+   private static final long serialVersionUID = 1L;
+
+   public static final Settings settings = Settings.getInstance();
+
+   /* REQUIRED
+    * Initialize your UWS. At least, you should create one jobs list. */
+   @Override
+   public void initUWS() throws UWSException
+   {
+      addJobList(new JobList("mcutout"));
+      UWSMCutoutUserIdentifier uwsUserIdentifier = new UWSMCutoutUserIdentifier();
+      setUserIdentifier(uwsUserIdentifier);
+   }
+
+   /*
+    * REQUIRED
+    * Create instances of jobs, but only the "work" part. The "work" and the description of the job (and all the provided parameters)
+    * are now separated and only kept in the UWSJob given in parameter. This one is created automatically by the API.
+    * You just have to provide the "work" part.
+    */
+   @Override
+   public JobThread createJobThread(UWSJob job) throws UWSException{
+      if (job.getJobList().getName().equals("mcutout")) // FIXME listanme to Config-file ?
+         return new UWSMCutoutWork(job);
+      else
+         throw new UWSException("Impossible to create a job inside the jobs list \"" + job.getJobList().getName() + "\" !");
+   }
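+
+   /* Illustrative client flow, assuming the /uws_mcutout/* mapping from the
+    * bundled web.xml (exact URLs depend on deployment):
+    *
+    *   curl -F mcutout=@request.json <base-url>/uws_mcutout/mcutout    # create job
+    *   curl -d PHASE=RUN <base-url>/uws_mcutout/mcutout/<jobId>/phase  # start it
+    *
+    * The uploaded JSON is read by UWSMCutoutWork as the "mcutout" parameter. */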
+
+   /* OPTIONAL
+    * By overriding this function, you can customize the root of your UWS.
+    * If this function is not overridden an XML document which lists all registered jobs lists is returned. */
+   @Override
+   protected void writeHomePage(UWSUrl requestUrl, HttpServletRequest req, HttpServletResponse resp, JobOwner user) throws UWSException, ServletException, IOException{
+      PrintWriter out = resp.getWriter();
+
+      out.println("<html><head><title>VLKB Multi-Cutout service</title></head><body>");
+      out.println("<h1>VLKB MCutout service</h1");
+      out.println("<p>Available job lists:</p>");
+
+      out.println("<ul>");
+      for(JobList jl : this){
+         out.println("<li>" + jl.getName() + " - " + jl.getNbJobs() + " jobs - <a href=\"" + requestUrl.listJobs(jl.getName()) + "\">" + requestUrl.listJobs(jl.getName()) + "</a></li>");
+      }
+      out.println("</ul>");
+   }
+
+}
diff --git a/data-access/servlet/src/main/java/webapi/UWSMCutoutJobOwner.java b/data-access/servlet/src/main/java/webapi/UWSMCutoutJobOwner.java
new file mode 100644
index 0000000000000000000000000000000000000000..2508347a81b2aca20df224b984037f15f6adb391
--- /dev/null
+++ b/data-access/servlet/src/main/java/webapi/UWSMCutoutJobOwner.java
@@ -0,0 +1,167 @@
+/*
+ *  Adapted from the UWS library's DefaultJobOwner: that class is final, so it
+ *  cannot be extended; this copy only adds group information via AuthPolicy.
+ */
+
+import java.util.HashMap;
+import java.util.Map;
+import java.util.Set;
+
+import uws.job.JobList;
+import uws.job.UWSJob;
+import uws.job.user.JobOwner;
+
+
+
+public final class UWSMCutoutJobOwner implements JobOwner {
+
+   private AuthPolicy auth;
+
+   private final String id;
+   private String pseudo;
+   private HashMap<String,Object> otherData = null;
+
+
+   public UWSMCutoutJobOwner(final AuthPolicy auth){
+      this.auth = auth;
+      this.id = auth.getUserName();
+      this.pseudo = this.id;
+   }
+
+   public UWSMCutoutJobOwner(final String name){
+      this(name, name);
+   }
+
+   public UWSMCutoutJobOwner(final String id, final String pseudo){
+      this.id = id;
+      this.pseudo = pseudo;
+   }
+
+   public String[] getGroups(){ return (auth == null) ? null : auth.getUserGroups(); }
+
+   @Override
+   public final String getID(){
+      return id;
+   }
+
+   @Override
+   public final String getPseudo(){
+      return pseudo;
+   }
+
+   public final void setPseudo(final String pseudo){
+      this.pseudo = pseudo;
+   }
+
+   /**
+    * By default: ALL users have the READ permission for ALL jobs lists.
+    * @see uws.job.user.JobOwner#hasReadPermission(uws.job.JobList)
+    */
+   @Override
+   public boolean hasReadPermission(JobList jl){
+      return true;
+   }
+
+   /**
+    * By default: ALL users have the WRITE permission for ALL jobs lists.
+    * @see uws.job.user.JobOwner#hasWritePermission(uws.job.JobList)
+    */
+   @Override
+   public boolean hasWritePermission(JobList jl){
+      return true;
+   }
+
+   /**
+    * By default: ONLY owners of the given job have the READ permission.
+    * @see uws.job.user.JobOwner#hasReadPermission(uws.job.UWSJob)
+    */
+   @Override
+   public boolean hasReadPermission(UWSJob job){
+      return (job == null) || (job.getOwner() == null) || (job.getOwner().equals(this));
+   }
+
+   /**
+    * By default: ONLY owners of the given job have the WRITE permission.
+    * @see uws.job.user.JobOwner#hasWritePermission(uws.job.UWSJob)
+    */
+   @Override
+   public boolean hasWritePermission(UWSJob job){
+      return (job == null) || (job.getOwner() == null) || (job.getOwner().equals(this));
+   }
+
+   /**
+    * By default: ONLY owners of the given job have the EXECUTE permission.
+    * @see uws.job.user.JobOwner#hasExecutePermission(uws.job.UWSJob)
+    */
+   @Override
+   public boolean hasExecutePermission(UWSJob job){
+      return (job == null) || (job.getOwner() == null) || (job.getOwner().equals(this));
+   }
+
+   public String putUserData(final String name, final String value){
+      if (otherData == null)
+         otherData = new HashMap<String,Object>();
+      return (String)otherData.put(name, value);
+   }
+
+   public String getUserData(final String name){
+      return (otherData == null) ? null : (String)otherData.get(name);
+   }
+
+   public String removeUserData(final String name){
+      return (otherData == null) ? null : (String)otherData.remove(name);
+   }
+
+   public Set<String> getAllUserData(){
+      return (otherData == null) ? null : otherData.keySet();
+   }
+
+   @Override
+   public Map<String,Object> getDataToSave(){
+      return otherData;
+   }
+
+   @Override
+   public void restoreData(Map<String,Object> data){
+      if (data == null || data.isEmpty())
+         return;
+
+      if (otherData == null)
+         otherData = new HashMap<String,Object>(data.size());
+
+      otherData.putAll(data);
+   }
+
+   /**
+    * By default: the user ID.
+    * @see java.lang.Object#toString()
+    */
+   @Override
+   public String toString(){
+      return id;
+   }
+
+   /**
+    * By default: a {@link UWSMCutoutJobOwner} is equal to any {@link JobOwner} only if their IDs are equal.
+    * @see java.lang.Object#equals(java.lang.Object)
+    */
+   @Override
+   public boolean equals(Object obj){
+      if (obj == null || !(obj instanceof JobOwner))
+         return false;
+
+      String objId = ((JobOwner)obj).getID();
+      return (id == null && objId == null) || (id != null && objId != null && id.equals(objId));
+   }
+
+   /**
+    * By default: this function returns the hashCode of the ID.
+    * @see java.lang.Object#hashCode()
+    */
+   @Override
+   public int hashCode(){
+      return id.hashCode();
+   }
+
+}
+
diff --git a/data-access/servlet/src/main/java/webapi/UWSMCutoutUserIdentifier.java b/data-access/servlet/src/main/java/webapi/UWSMCutoutUserIdentifier.java
new file mode 100644
index 0000000000000000000000000000000000000000..ae56f5b60b21c886ffce741554a27d43a484f098
--- /dev/null
+++ b/data-access/servlet/src/main/java/webapi/UWSMCutoutUserIdentifier.java
@@ -0,0 +1,54 @@
+
+import java.util.Map;
+
+import javax.servlet.http.HttpServletRequest;
+
+import uws.UWSException;
+import uws.job.user.JobOwner;
+import uws.job.user.DefaultJobOwner;
+import uws.service.UserIdentifier;
+import uws.service.UWSUrl;
+
+import org.json.JSONArray;
+import org.json.JSONObject;
+
+public class UWSMCutoutUserIdentifier implements UserIdentifier
+{
+   // returns null if no security configured
+   // throws exception if UserPrincipal not of expected type
+   // returns AuthPolicy  (UserName and UserGroups) otherwise
+   public JobOwner extractUserId(UWSUrl urlInterpreter, HttpServletRequest request) throws UWSException
+   {
+      // if security not configured, there is no job-owner
+
+      if(request.getUserPrincipal() == null)
+         return null;
+
+      // if security configured, try to get user
+
+      AuthPolicy auth = null; 
+      try
+      {
+         // we need to exec the check inside AuthPolicy-constructor here
+         // because this is called before every UWS-action and JobWork only at PHASE=RUN
+         // Otherwise we could only pass UserPrincipal and use AuthPolicy in JobWork
+
+         auth = new AuthPolicy(request.getUserPrincipal());
+      }
+      catch(IllegalArgumentException iae)
+      {
+         // UserPrincipal of not expected type -> access not allowed on this system
+         throw new UWSException(UWSException.BAD_REQUEST, iae);
+      }
+
+      UWSMCutoutJobOwner jobOwner = new UWSMCutoutJobOwner(auth);
+
+      return jobOwner;
+   }
+
+   public JobOwner restoreUser(final String id, final String pseudo, final Map<String,Object> otherData) throws UWSException
+   {
+      return new DefaultJobOwner(id,pseudo);
+   }
+
+}
diff --git a/data-access/servlet/src/main/java/webapi/UWSMCutoutWork.java b/data-access/servlet/src/main/java/webapi/UWSMCutoutWork.java
new file mode 100644
index 0000000000000000000000000000000000000000..259cecb8ab8f6577c58e07452c39e5119a6b4c72
--- /dev/null
+++ b/data-access/servlet/src/main/java/webapi/UWSMCutoutWork.java
@@ -0,0 +1,111 @@
+import java.io.File;
+import java.io.PrintWriter;
+import java.io.OutputStream;
+import java.io.BufferedOutputStream;
+import java.io.OutputStreamWriter;
+import java.io.IOException;
+import java.io.BufferedReader;
+import java.io.InputStreamReader;
+
+import uws.UWSException;
+import uws.job.ErrorType;
+import uws.job.JobThread;
+import uws.job.user.JobOwner;
+import uws.job.user.DefaultJobOwner;
+import uws.job.Result;
+import uws.job.UWSJob;
+import uws.job.jobInfo.JobInfo;
+import uws.job.jobInfo.XMLJobInfo;
+import uws.job.jobInfo.SingleValueJobInfo;
+
+import uws.service.request.UploadFile;
+import uws.service.file.LocalUWSFileManager;
+import uws.service.file.UWSFileManager;
+import uws.service.UWSUrl;
+
+
+
+
+public class UWSMCutoutWork extends JobThread
+{
+   final String RESP_ENCODING = "utf-8";
+
+   private Settings settings = UWSMCutout.settings;
+
+   protected Datasets datasets = new DatasetsImpl(settings);
+
+   /* NOTE needed if cutouts dir served by vlkb-datasets */
+   private String webappRootRequestUrl = null;
+
+   public UWSMCutoutWork(UWSJob j) throws UWSException{
+      super(j);
+      UWSUrl url = j.getUrl();
+      webappRootRequestUrl = url.getUrlHeader();
+   }
+
+
+
+   @Override
+   protected void jobWork() throws UWSException, InterruptedException
+   {
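+      /* Pipeline: read the uploaded "mcutout" JSON (UWS upload) -> run the
+       * multi-cutout via Datasets.doMCutout -> serialize the legacy XML report
+       * into the job Result. */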
+      try
+      {
+         long startTime_msec = System.currentTimeMillis();
+         boolean showDuration = true;
+
+         /* UWS -> SODA (JDL in POST body is part of SODA REC) */
+
+         UploadFile jsonFile = (UploadFile)job.getAdditionalParameterValue("mcutout");
+         final String contentType = "text/xml"; // FIXME should be input param ? RESPONSEFORMAT ?
+
+         Result result = createResult("Report");
+         result.setMimeType("text/xml");
+         OutputStream respOutputStream = getResultOutput(result);
+
+         if(contentType.equals("text/xml") || contentType.equals("application/xml"))
+         {
+            InputStreamReader isr = new InputStreamReader(jsonFile.open());
+            BufferedReader input = new BufferedReader(isr);
+            StringBuilder jsonStringBuffer = new StringBuilder();
+            String line;
+            while((line = input.readLine()) != null)
+            {
+               jsonStringBuffer.append(line);
+            }
+            String reqJsonString = jsonStringBuffer.toString();
+
+            /* SODA -> Implementation */
+
+            DataLink dlk = datasets.doMCutout(reqJsonString);
+
+            /* Implement -> SODA */
+
+            PrintWriter writer = new PrintWriter(new OutputStreamWriter(respOutputStream, RESP_ENCODING));
+            XmlSerializer.serializeToLegacyCutResults(writer, RESP_ENCODING, dlk, showDuration, startTime_msec);
+            writer.close();
+
+            /* SODA -> UWS */
+         }
+         else
+         {
+            throw new AssertionError("Unsupported contentType for output: " + contentType);
+         }
+
+         /* FIXME a UWS isInterrupted() check used to be here (job cancellation) */
+
+         publishResult(result);
+      }
+      catch(IOException ex)
+      {
+         throw new UWSException("Internal error: jsonFile.open() throws IOException:" + ex.getMessage());
+      }
+   }
+
+}
diff --git a/data-access/servlet/src/main/java/webapi/UWSMerge.java b/data-access/servlet/src/main/java/webapi/UWSMerge.java
new file mode 100644
index 0000000000000000000000000000000000000000..dea1ef6c2db1d1c7dff5aac74472e4802738dcbc
--- /dev/null
+++ b/data-access/servlet/src/main/java/webapi/UWSMerge.java
@@ -0,0 +1,91 @@
+import java.io.IOException;
+import java.io.PrintWriter;
+
+import javax.servlet.ServletException;
+import javax.servlet.http.HttpServletRequest;
+import javax.servlet.http.HttpServletResponse;
+
+import uws.UWSException;
+import uws.job.ErrorType;
+import uws.job.JobList;
+import uws.job.JobThread;
+import uws.job.UWSJob;
+import uws.job.parameters.InputParamController;
+import uws.job.parameters.NumericParamController;
+import uws.job.parameters.StringParamController;
+import uws.job.user.JobOwner;
+import uws.service.UWSServlet;
+import uws.service.UWSUrl;
+
+public class UWSMerge extends UWSServlet {
+   private static final long serialVersionUID = 1L;
+
+   public static final Settings settings = Settings.getInstance();
+
+   /* REQUIRED
+    * Initialize your UWS. At least, you should create one jobs list. */
+   @Override
+   public void initUWS() throws UWSException{
+      addJobList(new JobList("merges"));
+
+      addExpectedAdditionalParameter("surveyname");
+      addExpectedAdditionalParameter("species");
+      addExpectedAdditionalParameter("transition");
+
+      addExpectedAdditionalParameter("pubdid");
+      //setInputParamController("pubdid", new StringParamController("pubdid"));
+
+      addExpectedAdditionalParameter("l");
+      addExpectedAdditionalParameter("b");
+      addExpectedAdditionalParameter("r");
+      addExpectedAdditionalParameter("dl");
+      addExpectedAdditionalParameter("db");
+
+      addExpectedAdditionalParameter("vl");
+      addExpectedAdditionalParameter("vu");
+      addExpectedAdditionalParameter("vt");
+
+      setInputParamController("l", new NumericParamController());
+      setInputParamController("b", new NumericParamController());
+      setInputParamController("r", new NumericParamController());
+      setInputParamController("dl", new NumericParamController());
+      setInputParamController("db", new NumericParamController());
+      setInputParamController("vu", new NumericParamController());
+      setInputParamController("vl", new NumericParamController());
+      setInputParamController("vt", new StringParamController("1", "1", new String[]{"1","2"}, false));
+      // FIXME replace "1" "2" with proper spectral axis names
+   }
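+
+   /* Illustrative submission with the legacy parameters declared above,
+    * assuming the /uws_merge/* mapping from the bundled web.xml:
+    *
+    *   curl -d "pubdid=<pubdid1;pubdid2>" -d l=10.0 -d b=0.5 -d r=0.1 \
+    *        <base-url>/uws_merge/merges
+    *   curl -d PHASE=RUN <base-url>/uws_merge/merges/<jobId>/phase
+    */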
+
+   /*
+    * REQUIRED
+    * Create instances of jobs, but only the "work" part. The "work" and the description of the job (and all the provided parameters)
+    * are now separated and only kept in the UWSJob given in parameter. This one is created automatically by the API.
+    * You just have to provide the "work" part.
+    */
+   @Override
+   public JobThread createJobThread(UWSJob job) throws UWSException{
+      if (job.getJobList().getName().equals("merges"))
+         return new UWSMergeWork(job);
+      else
+         throw new UWSException("Impossible to create a job inside the jobs list \"" + job.getJobList().getName() + "\" !");
+   }
+
+   /* OPTIONAL
+    * By overriding this function, you can customize the root of your UWS.
+    * If this function is not overridden an XML document which lists all registered jobs lists is returned. */
+   @Override
+   protected void writeHomePage(UWSUrl requestUrl, HttpServletRequest req, HttpServletResponse resp, JobOwner user) throws UWSException, ServletException, IOException{
+      PrintWriter out = resp.getWriter();
+
+      out.println("<html><head><title>VLKB Merge service (by UWS version 4)</title></head><body>");
+      out.println("<h1>VLKB Merge service</h1");
+      out.println("<p>Available job lists:</p>");
+
+      out.println("<ul>");
+      for(JobList jl : this){
+         out.println("<li>" + jl.getName() + " - " + jl.getNbJobs() + " jobs - <a href=\"" + requestUrl.listJobs(jl.getName()) + "\">" + requestUrl.listJobs(jl.getName()) + "</a></li>");
+      }
+      out.println("</ul>");
+   }
+
+}
diff --git a/data-access/servlet/src/main/java/webapi/UWSMergeWork.java b/data-access/servlet/src/main/java/webapi/UWSMergeWork.java
new file mode 100644
index 0000000000000000000000000000000000000000..3e1ab78d1490d900e4609bfb47f3f2f1a77a208a
--- /dev/null
+++ b/data-access/servlet/src/main/java/webapi/UWSMergeWork.java
@@ -0,0 +1,158 @@
+import java.io.PrintWriter;
+import java.io.OutputStream;
+import java.io.OutputStreamWriter;
+import java.io.BufferedOutputStream;
+import javax.servlet.ServletOutputStream;
+import java.io.IOException;
+import java.io.FileNotFoundException;
+import java.io.File;
+import java.io.FileInputStream;
+
+import uws.UWSException;
+import uws.job.ErrorType;
+import uws.job.JobThread;
+import uws.job.Result;
+import uws.job.UWSJob;
+import uws.service.UWSUrl;
+
+/* for datasets::doAction */
+import java.security.Principal;
+import java.util.Map;
+import java.util.HashMap;
+import java.util.Set;
+import java.util.List;
+import java.util.ArrayList;
+
+public class UWSMergeWork extends JobThread
+{
+   private Settings settings = UWSMerge.settings;
+
+   final String RESPONSE_ENCODING = "utf-8";
+   final String DEFAULT_RESPONSEFORMAT = settings.defaults.responseFormat;
+   final String DEFAULT_SKY_SYSTEM     = settings.defaults.skySystem;
+   final String DEFAULT_SPEC_SYSTEM    = settings.defaults.specSystem;
+
+
+   Datasets datasets = new DatasetsImpl(settings);
+
+   /* NOTE needed if cutouts dir served by vlkb-datasets */
+   private String webappRootRequestUrl = null;
+
+   public UWSMergeWork(UWSJob j) throws UWSException
+   {
+      super(j);
+      UWSUrl url = j.getUrl();
+      webappRootRequestUrl = url.getUrlHeader();
+   }
+
+
+   /* FIXME how to set result.setSize(size) in UWS? */
+
+   @Override
+   protected void jobWork() throws UWSException, InterruptedException
+   {
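+      /* Pipeline: parse SODA or legacy VLKB parameters -> merge the requested
+       * cubes via Datasets.doMerge -> return the legacy XML report (or the
+       * merged FITS file, depending on the content type). */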
+      try
+      {
+         long startTime_msec = System.currentTimeMillis();
+         boolean showDuration = true;
+
+         Map<SodaParam, String[]> params = collectSodaParams(job);
+         SodaParser parser = new SodaParser(params);
+
+         String id   = null;
+         Pos    pos  = null;
+         Band   band = null;
+         Time   time = null;
+         Pol    pol  = null;
+
+         if(parser.sodaReq_hasSodaId())
+         {
+            id   = parser.sodaReq_getId();
+            pos  = parser.sodaReq_getPosCirclePolygon();
+            band = parser.sodaReq_getBand();
+            time = parser.sodaReq_getTime();
+            pol  = parser.sodaReq_getPol();
+         }
+         else
+         {
+            id   = parser.vlkbReq_getPubdid();
+            pos  = parser.vlkbReq_getCircleRect();
+            band = parser.vlkbReq_getVelocity();
+         }
+
+         Coord coord = new Coord(DEFAULT_SKY_SYSTEM, pos, DEFAULT_SPEC_SYSTEM, band, time, pol);
+
+         CutResult cutResult  = datasets.doMerge(parseLegacyPubdidArr(id), coord, false /* countNullValues */);
+         DataLink dlk = new DataLink(cutResult);
+
+         final String respFormat = "text/xml"; // FIXME read from RESPONSEFORMAT param?
+
+
+         String contentType = respFormat;
+         String respEncoding = RESPONSE_ENCODING;
+         Result result = createResult("Report");
+         result.setMimeType(respFormat);
+         OutputStream respOutputStream = getResultOutput(result);
+
+         if(contentType.equals("text/xml") || contentType.equals("application/xml"))
+         {
+            PrintWriter writer = new PrintWriter(new OutputStreamWriter(respOutputStream, respEncoding));
+            XmlSerializer.serializeToLegacyCutResults(writer, respEncoding, dlk, showDuration, startTime_msec);
+            writer.close();
+         }
+         else if(contentType.equals("application/fits"))
+         {
+            File downloadFile = new File(dlk.absCutPathname);
+            FileInputStream input = new FileInputStream(downloadFile);
+            input.transferTo(respOutputStream);
+            downloadFile.delete();
+         }
+         else
+         {
+            throw new AssertionError("Unsupported contentType for output: " + contentType);
+         }
+
+         respOutputStream.close();
+         publishResult(result);
+      }
+      catch(IllegalArgumentException ex)
+      {
+         throw new UWSException(UWSException.BAD_REQUEST, ex.getMessage());
+      }
+      catch(FileNotFoundException ex)
+      {
+         throw new UWSException(UWSException.BAD_REQUEST, ex.getMessage());
+      }
+      catch(IOException ex)
+      {
+         throw new UWSException(UWSException.BAD_REQUEST, ex.getMessage());
+      }
+   }
+
+
+   /* convert a semicolon-separated list of pubdids to an array, skipping empty entries (e.g. "A;;B" -> {"A","B"}) */
+   private String[] parseLegacyPubdidArr(String pubdids)
+   {
+      List<String> pubdidList = new ArrayList<String>();
+      String[] pdArr = pubdids.split(";");
+      for(String pd : pdArr)
+         if(pd.length() > 0) pubdidList.add(pd);
+
+      String[] pubdidArr = new String[pubdidList.size()];
+
+      return pubdidList.toArray(pubdidArr);
+   }
+
+   private Map<SodaParam, String[]> collectSodaParams(UWSJob job)
+   {
+      Map<SodaParam, String[]> params = new HashMap<SodaParam, String[]>();
+      for(SodaParam paramToken : SodaParam.values())
+      {
+         String[] paramValue = (String[])job.getAdditionalParameterValue(paramToken.toString());
+         params.put(paramToken, paramValue);
+      }
+      return params;
+   }
+
+}
+
diff --git a/data-access/servlet/src/main/java/webapi/UWSSoda.java b/data-access/servlet/src/main/java/webapi/UWSSoda.java
new file mode 100644
index 0000000000000000000000000000000000000000..86d8d877f3af763151f48d1f2e66647a9fdac6e7
--- /dev/null
+++ b/data-access/servlet/src/main/java/webapi/UWSSoda.java
@@ -0,0 +1,80 @@
+import java.io.IOException;
+import java.io.PrintWriter;
+
+import javax.servlet.ServletException;
+import javax.servlet.http.HttpServletRequest;
+import javax.servlet.http.HttpServletResponse;
+
+import uws.UWSException;
+import uws.job.ErrorType;
+import uws.job.JobList;
+import uws.job.JobThread;
+import uws.job.UWSJob;
+import uws.job.parameters.InputParamController;
+import uws.job.parameters.NumericParamController;
+import uws.job.parameters.StringParamController;
+import uws.job.user.JobOwner;
+import uws.service.UWSServlet;
+import uws.service.UWSUrl;
+
+public class UWSSoda extends UWSServlet {
+   private static final long serialVersionUID = 1L;
+
+   public static final Settings settings = Settings.getInstance();
+
+   public static final String csvSurveysFile = settings.fitsPaths.FITSpath + "/" + "survey_populate.csv"; // FIXME use settings.fitsPaths.surveysMetadataAbsPathname() instead
+   public static final Subsurvey[] subsurveys = Subsurvey.loadSubsurveys(csvSurveysFile);
+
+   /* REQUIRED
+    * Initialize your UWS. At least, you should create one jobs list. */
+   @Override
+   public void initUWS() throws UWSException{
+      addJobList(new JobList("soda_cuts"));
+
+      addExpectedAdditionalParameter("ID");
+      //setInputParamController("ID", new StringParamController("ID"));
+
+      addExpectedAdditionalParameter("CIRCLE");
+      addExpectedAdditionalParameter("BAND");
+
+      /* non standard params */
+      addExpectedAdditionalParameter("skysystem");
+      addExpectedAdditionalParameter("specsystem");
+
+      //setInputParamController("l", new NumericParamController());
+      //setInputParamController("vt", new StringParamController("1", "1", new String[]{"1","2"}, false));
+   }
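+
+   /* Illustrative SODA-style submission (parameters declared above; per the
+    * SODA spec, CIRCLE is "lon lat radius" and BAND a two-value interval;
+    * assumes the /soda_uws/* mapping from the bundled web.xml):
+    *
+    *   curl -d ID=<dataset-id> -d "CIRCLE=10.0 0.5 0.1" \
+    *        <base-url>/soda_uws/soda_cuts
+    *   curl -d PHASE=RUN <base-url>/soda_uws/soda_cuts/<jobId>/phase
+    */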
+
+   /*
+    * REQUIRED
+    * Create instances of jobs, but only the "work" part. The "work" and the description of the job (and all the provided parameters)
+    * are now separated and only kept in the UWSJob given in parameter. This one is created automatically by the API.
+    * You just have to provide the "work" part.
+    */
+   @Override
+   public JobThread createJobThread(UWSJob job) throws UWSException{
+      if (job.getJobList().getName().equals("soda_cuts"))
+         return new UWSSodaWork(job);
+      else
+         throw new UWSException("Impossible to create a job inside the jobs list \"" + job.getJobList().getName() + "\" !");
+   }
+
+   /* OPTIONAL
+    * By overriding this function, you can customize the root of your UWS.
+    * If this function is not overridden an XML document which lists all registered jobs lists is returned. */
+   @Override
+   protected void writeHomePage(UWSUrl requestUrl, HttpServletRequest req, HttpServletResponse resp, JobOwner user) throws UWSException, ServletException, IOException{
+      PrintWriter out = resp.getWriter();
+
+      out.println("<html><head><title>Soda UWSv4 service</title></head><body>");
+      out.println("<h1>Soda UWS service</h1");
+      out.println("<p>Available job lists:</p>");
+
+      out.println("<ul>");
+      for(JobList jl : this){
+         out.println("<li>" + jl.getName() + " - " + jl.getNbJobs() + " jobs - <a href=\"" + requestUrl.listJobs(jl.getName()) + "\">" + requestUrl.listJobs(jl.getName()) + "</a></li>");
+      }
+      out.println("</ul>");
+   }
+
+}
diff --git a/data-access/servlet/src/main/java/webapi/UWSSodaWork.java b/data-access/servlet/src/main/java/webapi/UWSSodaWork.java
new file mode 100644
index 0000000000000000000000000000000000000000..d15bf7fa9d810a1fb4334124b4bb6ede3877a5c2
--- /dev/null
+++ b/data-access/servlet/src/main/java/webapi/UWSSodaWork.java
@@ -0,0 +1,176 @@
+import java.io.PrintWriter;
+import java.io.BufferedOutputStream;
+import java.io.OutputStream;
+import java.io.IOException;
+import java.io.OutputStreamWriter;
+
+import java.io.File;
+import java.io.FileInputStream;
+
+import uws.UWSException;
+import uws.job.ErrorType;
+import uws.job.JobThread;
+import uws.job.Result;
+import uws.job.UWSJob;
+import uws.service.UWSUrl;
+
+import java.util.Map;
+import java.util.HashMap;
+
+import java.security.Principal;
+
+public class UWSSodaWork extends JobThread
+{
+   private Settings     settings    = UWSSoda.settings;
+   private Subsurvey[]  subsurveys  = UWSSoda.subsurveys;
+
+   final String RESPONSE_ENCODING = "utf-8";
+   final String DEFAULT_RESPONSEFORMAT = settings.defaults.responseFormat;
+   final String DEFAULT_SKY_SYSTEM     = settings.defaults.skySystem;
+   final String DEFAULT_SPEC_SYSTEM    = settings.defaults.specSystem;
+
+   /* NOTE needed if cutouts dir served by vlkb-datasets */
+   private String webappRootRequestUrl = null;
+
+
+
+   public UWSSodaWork(UWSJob j) throws UWSException
+   {
+      super(j);
+      UWSUrl url = j.getUrl();
+      webappRootRequestUrl = url.getUrlHeader();
+   }
+
+
+
+   @Override
+   protected void jobWork() throws UWSException, InterruptedException
+   {
+      // result.setSize(size); // FIXME how to add this? not set in ServeltCutout either (only inside DataLink)
+
+      long startTime_msec = System.currentTimeMillis();
+      boolean showDuration = true;
+
+      try
+      {
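+         /* Pipeline: parse SODA or legacy VLKB parameters -> resolve ID to a
+          * FITS file (Resolver) -> cut via Datasets.doCutoutFile -> stream the
+          * FITS cutout or the legacy XML report, per the response content type. */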
+         Map<SodaParam, String[]> params = collectSodaParams(job);
+         SodaParser parser = new SodaParser(params);
+
+         String id   = null;
+         Pos    pos  = null;
+         Band   band = null;
+         Time   time = null;
+         Pol    pol  = null;
+
+         if(parser.sodaReq_hasSodaId())
+         {
+            id   = parser.sodaReq_getId();
+            pos  = parser.sodaReq_getPosCirclePolygon();
+            band = parser.sodaReq_getBand();
+            time = parser.sodaReq_getTime();
+            pol  = parser.sodaReq_getPol();
+         }
+         else
+         {
+            id   = parser.vlkbReq_getPubdid();
+            pos  = parser.vlkbReq_getCircleRect();
+            band = parser.vlkbReq_getVelocity();
+         }
+
+         Coord coord = new Coord(DEFAULT_SKY_SYSTEM, pos, DEFAULT_SPEC_SYSTEM, band, time, pol);
+
+         final String respContentType = "application/fits"; // FIXME parse this from RESPONSEFORMAT param
+
+         // implementation
+
+         Resolver rsl = new Resolver(settings);
+         rsl.resolve(id);
+
+         /* metadata/subsurveys */
+         FitsCard[] extraCards = null;
+         if(rsl.subsurveyId != null)
+         {
+            extraCards = Subsurvey.subsurveysFindCards(subsurveys, rsl.subsurveyId);
+         }
+
+         Datasets datasets = new DatasetsImpl(settings);
+
+         final String DEFAULT_TIME_SYSTEM = "MJD_UTC"; // FIXME take from config file
+
+         if(pos  != null) pos.setSystem(Pos.System.valueOf(DEFAULT_SKY_SYSTEM));
+         if(band != null) band.setSystem(Band.System.valueOf(DEFAULT_SPEC_SYSTEM));
+         if(time != null) time.setSystem(Time.System.valueOf(DEFAULT_TIME_SYSTEM));
+
+         CutResult cutResult = datasets.doCutoutFile(rsl.relPathname, rsl.hdunum, pos, band, time, pol, false, null);
+
+         DataLink respDataLink = new DataLink(cutResult);
+
+         respDataLink.inputs = new Inputs(id, coord, false /* countNullValues */);
+
+         /* send Results */
+
+         Result result = createResult("cutout");
+         result.setMimeType(respContentType);
+         OutputStream respOutputStream = getResultOutput(result);
+
+         if(respContentType.equals("text/xml") || respContentType.equals("application/xml"))
+         {
+            PrintWriter writer = new PrintWriter(new OutputStreamWriter(respOutputStream, RESPONSE_ENCODING));
+            XmlSerializer.serializeToLegacyCutResults(writer, RESPONSE_ENCODING, respDataLink, showDuration, startTime_msec);
+            writer.close();
+         }
+         else if(respContentType.equals("application/fits"))
+         {
+            File downloadFile = new File(respDataLink.absCutPathname);
+            FileInputStream input = new FileInputStream(downloadFile);
+
+            input.transferTo(respOutputStream);
+
+            downloadFile.delete();
+         }
+         else
+         {
+            throw new AssertionError("Unsupported contentType for response: " + respContentType);
+         }
+
+         respOutputStream.close();
+         publishResult(result);
+      }
+      catch(IllegalArgumentException ex)
+      {
+         throw new UWSException(UWSException.INTERNAL_SERVER_ERROR, ex,
+               "Illegal arg exception " + job.getJobId() + " !", ErrorType.TRANSIENT);
+      }
+      catch(IOException ex)
+      {
+         throw new UWSException(UWSException.INTERNAL_SERVER_ERROR, ex,
+               "IO exception " + job.getJobId() + " !", ErrorType.TRANSIENT);
+      }
+   }
+
+
+   /* see UWSParameters::getAdditionalParameter -> UWSParameters.java.l465: ...returned value maybe an array... */
+   /* Object getAdditionalParameterValue(String paramName) --> calls UWSParameters::getAdditionalParameters()   */
+   private Map<SodaParam, String[]> collectSodaParams(UWSJob job)
+   {
+      Map<SodaParam, String[]> params = new HashMap<SodaParam, String[]>();
+      for(SodaParam paramToken : SodaParam.values())
+      {
+         String[] paramValue = (String[])job.getAdditionalParameterValue(paramToken.toString());
+         params.put(paramToken, paramValue);
+      }
+      return params;
+   }
+
+   /*
+      Map<String, String[]> params = Map.of(
+      "ID" , new String[]{pubdid},
+      "CIRCLE", new String[]{circleStr},
+      "BAND", new String[]{bandStr},
+      "skysystem", new String[]{skysystemStr},
+      "specsystem", new String[]{specsystemStr}
+      );
+      */
+}
diff --git a/data-access/servlet/src/main/java/webapi/output/Inputs.java b/data-access/servlet/src/main/java/webapi/output/Inputs.java
new file mode 100644
index 0000000000000000000000000000000000000000..cacfa96ef18794017dd3c1292d2c97f78ea3b453
--- /dev/null
+++ b/data-access/servlet/src/main/java/webapi/output/Inputs.java
@@ -0,0 +1,32 @@
+
+
+
+
+class Inputs
+{
+   String pubdid;
+   Coord coord;
+   boolean countNullValues;
+   AuthPolicy auth;
+
+   public Inputs(AuthPolicy auth, String pubdid, Coord coord, boolean countNullValues)
+   {
+      this.pubdid = pubdid;
+      this.coord = coord;
+      this.countNullValues = countNullValues;
+      this.auth = auth;
+   }
+
+   public Inputs(AuthPolicy auth, Coord coord, boolean countNullValues)
+   {
+      this(auth, null, coord, countNullValues);
+   }
+
+   public Inputs(String pubdid, Coord coord, boolean countNullValues)
+   {
+      this(null, pubdid, coord, countNullValues);
+   }
+
+}
+
diff --git a/data-access/servlet/src/main/java/webapi/output/XmlSerializer.java b/data-access/servlet/src/main/java/webapi/output/XmlSerializer.java
new file mode 100644
index 0000000000000000000000000000000000000000..9ccfc544d58bbe7d19929c835657de60b060065e
--- /dev/null
+++ b/data-access/servlet/src/main/java/webapi/output/XmlSerializer.java
@@ -0,0 +1,119 @@
+
+import java.util.logging.Logger;
+import java.io.PrintWriter;
+
+
+
+public final class XmlSerializer
+{
+   private static final Logger LOGGER = Logger.getLogger("XmlSerializer");
+
+   private XmlSerializer() {} // disables instantiation
+
+
+   public static void serializeToLegacyCutResults(PrintWriter writer, String charEncoding, DataLink dataLink,
+         boolean showDuration, long startTime_msec)
+   {
+      LOGGER.info("trace");
+
+      writer.println("<?xml version=\"1.0\" encoding=\"" + charEncoding + "\" standalone=\"yes\"?>");
+      writer.println("<results>");
+      writer.println("<description> " + dataLink.description + " </description>");
+      serialize(writer, dataLink.inputs);
+
+      if(dataLink.cut != null)
+         writer.println("<CUT> " + dataLink.cut + " </CUT>");
+      if(dataLink.accessUrl != null) 
+      {
+         writer.println("<URL> " + dataLink.accessUrl + " </URL>");
+         writer.println("<cutoutSize> " + dataLink.contentLength + " </cutoutSize>");
+      }
+      if(dataLink.nullVals != null)
+         writer.println(serialize(dataLink.nullVals ));
+ 
+      writer.println("<msg> " + dataLink.versionString + " </msg>");
+ 
+      writer.println("<DatacubeCount> " + dataLink.datacubeCount + " </DatacubeCount>");
+
+      if(showDuration)
+         writer.println("<duration unit=\"msec\">" + (System.currentTimeMillis() - startTime_msec) + "</duration>");
+      writer.println("</results>");
+  }
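+
+   /* Sketch of the document this produces (element values depend on the DataLink):
+    *
+    *   <?xml version="1.0" encoding="utf-8" standalone="yes"?>
+    *   <results>
+    *     <description> ... </description>
+    *     <input> ... </input>
+    *     <URL> ... </URL>
+    *     <cutoutSize> ... </cutoutSize>
+    *     <msg> ... </msg>
+    *     <DatacubeCount> ... </DatacubeCount>
+    *     <duration unit="msec"> ... </duration>
+    *   </results>
+    */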
+
+
+   public static String serialize(NullValueCount nullVals)
+   {
+      StringBuilder xml = new StringBuilder();
+      xml.append("<nullValues>");
+      xml.append("<description> Undefined pixel count </description>");
+      xml.append("<percent>"    + nullVals.percent + "</percent>");
+      xml.append("<pixels>");
+      xml.append("<nullcount>"  + nullVals.nullCount + "</nullcount>");
+      xml.append("<totalcount>" + nullVals.totalCount + "</totalcount>");
+      xml.append("</pixels>");
+      xml.append("</nullValues>");
+      return xml.toString();
+   }
+
+   public static String serialize(Coord coord)
+   {
+      StringBuilder xml = new StringBuilder();
+      xml.append("<SkySystem>"+coord.skySystem+"</SkySystem>");
+
+      /* reconstruct VLKB-legacy param values from SODA-params */
+
+      if(coord.shape != null)
+      {
+         switch(coord.shape)
+         {
+            case "CIRCLE" :
+               xml.append("<l>"+coord.pos.circle.lon+"</l>");
+               xml.append("<b>"+coord.pos.circle.lat+"</b>");
+               xml.append("<r>"+coord.pos.circle.radius+"</r>");
+               break;
+            case "RECT"   :
+               xml.append("<l>"  + String.valueOf((coord.pos.range.lon1 + coord.pos.range.lon2)/2.0) + "</l>");
+               xml.append("<b>"  + String.valueOf((coord.pos.range.lat1 + coord.pos.range.lat2)/2.0) + "</b>");
+               xml.append("<dl>" + String.valueOf(coord.pos.range.lon2 - coord.pos.range.lon1) + "</dl>");
+               xml.append("<db>" + String.valueOf(coord.pos.range.lat2 - coord.pos.range.lat1) + "</db>");
+               break;
+            default:
+               xml.append("<shape> unknown shape: "+ coord.shape +" </shape>");
+         }
+      }
+
+      if(coord.band != null)
+      {
+         xml.append("<vl>"   + String.valueOf(coord.band.wavelength[0])  +"</vl>");
+         xml.append("<vu>"   + String.valueOf(coord.band.wavelength[1])   +"</vu>");
+         xml.append("<vtype>" + coord.specSystem + "</vtype>");
+      }
+
+      return xml.toString();
+   }
+
+   public static String serialize(AuthPolicy auth)
+   {
+      StringBuilder xml = new StringBuilder();
+      xml.append("<AccessPolicy>" + auth.getAccessPolicy() + "</AccessPolicy>");
+      String ug = auth.getUserGroupsAsString(" ");
+      if(auth.getUserName() != null) xml.append("<UserName>" + auth.getUserName() + "</UserName>");
+      if(ug            != null) xml.append("<GroupNames>" + ug + "</GroupNames>");
+      return xml.toString();
+   }
+
+
+   public static void serialize(PrintWriter writer, Inputs inputs)
+   {
+      if(inputs != null)
+      {
+         writer.println("<input>");
+         if(inputs.pubdid      != null) writer.println("<pubdid>"+inputs.pubdid+"</pubdid>");
+         if(inputs.coord       != null) writer.println(serialize(inputs.coord));
+         if(inputs.countNullValues)     writer.println("<nullvals> set </nullvals>");
+         if(inputs.auth        != null) writer.println(serialize(inputs.auth));
+         writer.println("</input>");
+      }
+   }
+
+}
diff --git a/data-access/servlet/src/main/resources/authpolicy.properties b/data-access/servlet/src/main/resources/authpolicy.properties
new file mode 100644
index 0000000000000000000000000000000000000000..d1d5756218a28b49df6e1f92a8828c9f62c24cac
--- /dev/null
+++ b/data-access/servlet/src/main/resources/authpolicy.properties
@@ -0,0 +1,7 @@
+# database for table with permissions
+db_uri=
+db_schema=
+db_user_name=
+db_password=
+
+
diff --git a/data-access/servlet/src/main/resources/cutout.properties b/data-access/servlet/src/main/resources/cutout.properties
new file mode 100644
index 0000000000000000000000000000000000000000..67c91761ef42083cd900eb4d60f5e532e9db69f8
--- /dev/null
+++ b/data-access/servlet/src/main/resources/cutout.properties
@@ -0,0 +1,45 @@
+
+## path to FITS-file collections
+# fits_path_surveys=/srv/surveys
+
+## interpretation of values in SODA POS and BAND parameters
+# default_sky_system=ICRS
+# default_spec_system=WAVE_Barycentric
+
+## MIME-type of the response
+
+# [1]:
+
+# default_response_format=application/fits
+
+# xor [2]:
+
+# default_response_format=application/fits;createfile=yes
+# fits_path_cutouts=/srv/cutouts
+# amqp_host_name=localhost
+# amqp_port=5672
+# amqp_routing_key=
+
+# xor [3]:
+
+# default_response_format=application/x-vlkb+xml
+# surveys_metadata_abs_pathname=/srv/surveys/survey_populate.csv
+# fits_path_cutouts=/srv/cutouts
+# fits_url_cutouts=
+# amqp_host_name=localhost
+# amqp_port=5672
+# amqp_routing_key=
+
+
+# other features
+
+## database for resolver by mapping: key->path/to/fitsfile
+# db_uri=
+# db_schema=
+# db_user_name=
+# db_password=
+
+## should the response include the duration of the request execution: yes | no
+# show_duration=no
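+
+## example: a minimal variant [1] setup (illustrative values):
+# fits_path_surveys=/srv/surveys
+# default_response_format=application/fits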
+
+
diff --git a/data-access/servlet/src/main/webapp/META-INF/context.xml b/data-access/servlet/src/main/webapp/META-INF/context.xml
new file mode 100644
index 0000000000000000000000000000000000000000..4f5f504df9c52f4119d68bf48434f3afb0ae3861
--- /dev/null
+++ b/data-access/servlet/src/main/webapp/META-INF/context.xml
@@ -0,0 +1,15 @@
+<Context docBase="/webapps/vlkb-cutout">
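+        <!-- Expose the FITS directories through the webapp: /srv/cutouts is
+             mounted read-write under /cutouts and /srv/surveys read-only under
+             /surveys (Tomcat DirResourceSet); allowLinking permits symlinks. -->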
+
+        <Resources allowLinking="true">
+                <PostResources readOnly="false"
+                        className="org.apache.catalina.webresources.DirResourceSet"
+                        base="/srv/cutouts"
+                        webAppMount="/cutouts"/>
+                <PostResources readOnly="true"
+                        className="org.apache.catalina.webresources.DirResourceSet"
+                        base="/srv/surveys"
+                        webAppMount="/surveys"/>
+        </Resources>
+
+</Context>
+
diff --git a/data-access/servlet/src/main/webapp/WEB-INF/web-cutout-garrtoken.xml b/data-access/servlet/src/main/webapp/WEB-INF/web-cutout-garrtoken.xml
new file mode 100644
index 0000000000000000000000000000000000000000..4bfa3082752188be907f51e84e839bbb8bacd758
--- /dev/null
+++ b/data-access/servlet/src/main/webapp/WEB-INF/web-cutout-garrtoken.xml
@@ -0,0 +1,178 @@
+<?xml version="1.0" encoding="UTF-8"?>
+
+<!--
+ Copyright 2004-2005 Sun Microsystems, Inc.  All rights reserved.
+ Use is subject to license terms.
+-->
+
+<web-app version="2.4" xmlns="http://java.sun.com/xml/ns/j2ee" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://java.sun.com/xml/ns/j2ee http://java.sun.com/xml/ns/j2ee/web-app_2_4.xsd">
+        <display-name>Via Lactea. Query FITS datacubes.</display-name>
+        <distributable/>
+
+
+
+
+        <filter>
+                <filter-name>TokenFilter</filter-name>
+                <filter-class>NeaTokenFilter</filter-class>
+        </filter>
+        <filter-mapping>
+                <filter-name>TokenFilter</filter-name>
+                <url-pattern>/*</url-pattern>
+        </filter-mapping>
+
+        <filter>
+                <filter-name>AuthZFilter</filter-name>
+                <filter-class>AuthZFilter</filter-class>
+        </filter>
+        <filter-mapping>
+                <filter-name>AuthZFilter</filter-name>
+                <url-pattern>/*</url-pattern>
+        </filter-mapping>
+
+
+
+
+        <servlet>
+                <servlet-name>default</servlet-name>
+                <servlet-class>
+                        org.apache.catalina.servlets.DefaultServlet
+                </servlet-class>
+                <init-param>
+                        <param-name>debug</param-name>
+                        <param-value>1</param-value>
+                </init-param>
+                <init-param>
+                        <param-name>listings</param-name>
+                        <param-value>true</param-value>
+                </init-param>
+                <load-on-startup>1</load-on-startup>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>default</servlet-name>
+                <url-pattern>/</url-pattern>
+        </servlet-mapping>
+
+
+
+
+
+
+
+
+        <servlet>
+                <servlet-name>vlkb_cutout</servlet-name>
+                <servlet-class>ServletCutout</servlet-class>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>vlkb_cutout</servlet-name>
+                <url-pattern>/vlkb_cutout</url-pattern>
+        </servlet-mapping>
+
+        <servlet>
+                <servlet-name>vlkb_mcutout</servlet-name>
+                <servlet-class>ServletMCutout</servlet-class>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>vlkb_mcutout</servlet-name>
+                <url-pattern>/vlkb_mcutout</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>vlkb_merge</servlet-name>
+                <servlet-class>ServletMerge</servlet-class>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>vlkb_merge</servlet-name>
+                <url-pattern>/vlkb_merge</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>vlkb_vosi_availability</servlet-name>
+                <servlet-class>VlkbServletFile</servlet-class>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>vlkb_vosi_availability</servlet-name>
+                <url-pattern>/availability</url-pattern>
+        </servlet-mapping>
+
+        <servlet>
+                <servlet-name>vlkb_vosi_capabilities</servlet-name>
+                <servlet-class>VlkbServletFile</servlet-class>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>vlkb_vosi_capabilities</servlet-name>
+                <url-pattern>/capabilities</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>vlkb_soda</servlet-name>
+                <servlet-class>ServletCutout</servlet-class>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>vlkb_soda</servlet-name>
+                <url-pattern>/soda</url-pattern>
+        </servlet-mapping>
+        <servlet-mapping>
+                <servlet-name>vlkb_soda</servlet-name>
+                <url-pattern>/vlkb_soda</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>uws_merge</servlet-name>
+                <servlet-class>UWSMerge</servlet-class>
+                <init-param>
+                        <param-name>name</param-name>
+                        <param-value>merge</param-value>
+                </init-param>
+                <init-param>
+                        <param-name>rootDirectory</param-name>
+                        <param-value>/tmp</param-value>
+                </init-param>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>uws_merge</servlet-name>
+                <url-pattern>/uws_merge/*</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>uws_mcutout</servlet-name>
+                <servlet-class>UWSMCutout</servlet-class>
+                <init-param>
+                        <param-name>name</param-name>
+                        <param-value>mcutout</param-value>
+                </init-param>
+                <init-param>
+                        <param-name>rootDirectory</param-name>
+                        <param-value>/tmp</param-value>
+                </init-param>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>uws_mcutout</servlet-name>
+                <url-pattern>/uws_mcutout/*</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>uws_soda</servlet-name>
+                <servlet-class>UWSSoda</servlet-class>
+                <init-param>
+                        <param-name>name</param-name>
+                        <param-value>soda_uws</param-value>
+                </init-param>
+                <init-param>
+                        <param-name>rootDirectory</param-name>
+                        <param-value>/tmp</param-value>
+                </init-param>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>uws_soda</servlet-name>
+                <url-pattern>/soda_uws/*</url-pattern>
+        </servlet-mapping>
+
+</web-app>
diff --git a/data-access/servlet/src/main/webapp/WEB-INF/web-cutout-ia2token.xml b/data-access/servlet/src/main/webapp/WEB-INF/web-cutout-ia2token.xml
new file mode 100644
index 0000000000000000000000000000000000000000..4f93735276b275266e690bdef9e8e9eb1ae8340e
--- /dev/null
+++ b/data-access/servlet/src/main/webapp/WEB-INF/web-cutout-ia2token.xml
@@ -0,0 +1,188 @@
+<?xml version="1.0" encoding="UTF-8"?>
+
+<!--
+ Copyright 2004-2005 Sun Microsystems, Inc.  All rights reserved.
+ Use is subject to license terms.
+-->
+
+<web-app version="2.4" xmlns="http://java.sun.com/xml/ns/j2ee" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://java.sun.com/xml/ns/j2ee http://java.sun.com/xml/ns/j2ee/web-app_2_4.xsd">
+        <display-name>Via Lactea. Query FITS datacubes.</display-name>
+        <distributable/>
+
+
+
+        <filter>
+                <filter-name>TokenFilter</filter-name>
+                <filter-class>it.inaf.ia2.aa.TokenFilter</filter-class>
+        </filter>
+
+        <filter-mapping>
+                <filter-name>TokenFilter</filter-name>
+                <url-pattern>/*</url-pattern>
+        </filter-mapping>
+
+        <filter>
+                <filter-name>UserTypeConverter</filter-name>
+                <filter-class>IA2TokenConvFilter</filter-class>
+        </filter>
+
+        <filter-mapping>
+                <filter-name>UserTypeConverter</filter-name>
+                <url-pattern>/*</url-pattern>
+        </filter-mapping>
+
+        <filter>
+                <filter-name>AuthZFilter</filter-name>
+                <filter-class>AuthZFilter</filter-class>
+        </filter>
+        <filter-mapping>
+                <filter-name>AuthZFilter</filter-name>
+                <url-pattern>/*</url-pattern>
+        </filter-mapping>
+
+
+
+
+        <servlet>
+                <servlet-name>default</servlet-name>
+                <servlet-class>
+                        org.apache.catalina.servlets.DefaultServlet
+                </servlet-class>
+                <init-param>
+                        <param-name>debug</param-name>
+                        <param-value>1</param-value>
+                </init-param>
+                <init-param>
+                        <param-name>listings</param-name>
+                        <param-value>true</param-value>
+                </init-param>
+                <load-on-startup>1</load-on-startup>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>default</servlet-name>
+                <url-pattern>/</url-pattern>
+        </servlet-mapping>
+
+
+
+
+
+
+
+
+        <servlet>
+                <servlet-name>vlkb_cutout</servlet-name>
+                <servlet-class>ServletCutout</servlet-class>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>vlkb_cutout</servlet-name>
+                <url-pattern>/vlkb_cutout</url-pattern>
+        </servlet-mapping>
+
+        <servlet>
+                <servlet-name>vlkb_mcutout</servlet-name>
+                <servlet-class>ServletMCutout</servlet-class>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>vlkb_mcutout</servlet-name>
+                <url-pattern>/vlkb_mcutout</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>vlkb_merge</servlet-name>
+                <servlet-class>ServletMerge</servlet-class>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>vlkb_merge</servlet-name>
+                <url-pattern>/vlkb_merge</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>vlkb_vosi_availability</servlet-name>
+                <servlet-class>VlkbServletFile</servlet-class>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>vlkb_vosi_availability</servlet-name>
+                <url-pattern>/availability</url-pattern>
+        </servlet-mapping>
+
+        <servlet>
+                <servlet-name>vlkb_vosi_capabilities</servlet-name>
+                <servlet-class>VlkbServletFile</servlet-class>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>vlkb_vosi_capabilities</servlet-name>
+                <url-pattern>/capabilities</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>vlkb_soda</servlet-name>
+                <servlet-class>ServletCutout</servlet-class>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>vlkb_soda</servlet-name>
+                <url-pattern>/soda</url-pattern>
+        </servlet-mapping>
+        <servlet-mapping>
+                <servlet-name>vlkb_soda</servlet-name>
+                <url-pattern>/vlkb_soda</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>uws_merge</servlet-name>
+                <servlet-class>UWSMerge</servlet-class>
+                <init-param>
+                        <param-name>name</param-name>
+                        <param-value>merge</param-value>
+                </init-param>
+                <init-param>
+                        <param-name>rootDirectory</param-name>
+                        <param-value>/tmp</param-value>
+                </init-param>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>uws_merge</servlet-name>
+                <url-pattern>/uws_merge/*</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>uws_mcutout</servlet-name>
+                <servlet-class>UWSMCutout</servlet-class>
+                <init-param>
+                        <param-name>name</param-name>
+                        <param-value>mcutout</param-value>
+                </init-param>
+                <init-param>
+                        <param-name>rootDirectory</param-name>
+                        <param-value>/tmp</param-value>
+                </init-param>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>uws_mcutout</servlet-name>
+                <url-pattern>/uws_mcutout/*</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>uws_soda</servlet-name>
+                <servlet-class>UWSSoda</servlet-class>
+                <init-param>
+                        <param-name>name</param-name>
+                        <param-value>soda_uws</param-value>
+                </init-param>
+                <init-param>
+                        <param-name>rootDirectory</param-name>
+                        <param-value>/tmp</param-value>
+                </init-param>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>uws_soda</servlet-name>
+                <url-pattern>/soda_uws/*</url-pattern>
+        </servlet-mapping>
+
+</web-app>
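
The mappings above expose the synchronous cutout service at /soda (with /vlkb_cutout as a legacy alias) and the asynchronous UWS flavour under /soda_uws/*. A minimal sketch of a synchronous request follows; the host, port, context path and dataset identifier are placeholders, while ID and CIRCLE are the standard SODA query parameters:

```java
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.nio.file.Path;

public class SodaCutoutExample {
    public static void main(String[] args) throws Exception {
        // Hypothetical dataset ID; CIRCLE is <lon> <lat> <radius> in degrees.
        String id = URLEncoder.encode("ivo://example.org/datacube?obs_42", StandardCharsets.UTF_8);
        String query = "ID=" + id + "&CIRCLE=290.3+14.6+0.1";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8080/vlkb-soda/soda?" + query))
                .GET()
                .build();

        // The servlet streams the resulting cutout back; save it as a FITS file.
        HttpResponse<Path> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofFile(Path.of("cutout.fits")));
        System.out.println("HTTP " + response.statusCode() + " -> " + response.body());
    }
}
```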
diff --git a/data-access/servlet/src/main/webapp/WEB-INF/web-cutout-iamtoken.xml b/data-access/servlet/src/main/webapp/WEB-INF/web-cutout-iamtoken.xml
new file mode 100644
index 0000000000000000000000000000000000000000..1c3b1637db33107ca0a8c843c3e2e5495153cd82
--- /dev/null
+++ b/data-access/servlet/src/main/webapp/WEB-INF/web-cutout-iamtoken.xml
@@ -0,0 +1,169 @@
+<?xml version="1.0" encoding="UTF-8"?>
+
+<!--
+ Copyright 2004-2005 Sun Microsystems, Inc.  All rights reserved.
+ Use is subject to license terms.
+-->
+
+<web-app version="2.4" xmlns="http://java.sun.com/xml/ns/j2ee" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://java.sun.com/xml/ns/j2ee http://java.sun.com/xml/ns/j2ee/web-app_2_4.xsd">
+        <display-name>Via Lactea. Query FITS datacubes.</display-name>
+        <distributable/>
+
+
+
+
+        <filter>
+                <filter-name>TokenFilter</filter-name>
+                <filter-class>IamTokenFilter</filter-class>
+        </filter>
+        <filter-mapping>
+                <filter-name>TokenFilter</filter-name>
+                <url-pattern>/*</url-pattern>
+        </filter-mapping>
+
+
+
+
+        <servlet>
+                <servlet-name>default</servlet-name>
+                <servlet-class>
+                        org.apache.catalina.servlets.DefaultServlet
+                </servlet-class>
+                <init-param>
+                        <param-name>debug</param-name>
+                        <param-value>1</param-value>
+                </init-param>
+                <init-param>
+                        <param-name>listings</param-name>
+                        <param-value>true</param-value>
+                </init-param>
+                <load-on-startup>1</load-on-startup>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>default</servlet-name>
+                <url-pattern>/</url-pattern>
+        </servlet-mapping>
+
+
+
+
+
+
+
+
+        <servlet>
+                <servlet-name>vlkb_cutout</servlet-name>
+                <servlet-class>ServletCutout</servlet-class>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>vlkb_cutout</servlet-name>
+                <url-pattern>/vlkb_cutout</url-pattern>
+        </servlet-mapping>
+
+        <servlet>
+                <servlet-name>vlkb_mcutout</servlet-name>
+                <servlet-class>ServletMCutout</servlet-class>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>vlkb_mcutout</servlet-name>
+                <url-pattern>/vlkb_mcutout</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>vlkb_merge</servlet-name>
+                <servlet-class>ServletMerge</servlet-class>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>vlkb_merge</servlet-name>
+                <url-pattern>/vlkb_merge</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>vlkb_vosi_availability</servlet-name>
+                <servlet-class>VlkbServletFile</servlet-class>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>vlkb_vosi_availability</servlet-name>
+                <url-pattern>/availability</url-pattern>
+        </servlet-mapping>
+
+        <servlet>
+                <servlet-name>vlkb_vosi_capabilities</servlet-name>
+                <servlet-class>VlkbServletFile</servlet-class>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>vlkb_vosi_capabilities</servlet-name>
+                <url-pattern>/capabilities</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>vlkb_soda</servlet-name>
+                <servlet-class>ServletCutout</servlet-class>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>vlkb_soda</servlet-name>
+                <url-pattern>/soda</url-pattern>
+        </servlet-mapping>
+        <servlet-mapping>
+                <servlet-name>vlkb_soda</servlet-name>
+                <url-pattern>/vlkb_soda</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>uws_merge</servlet-name>
+                <servlet-class>UWSMerge</servlet-class>
+                <init-param>
+                        <param-name>name</param-name>
+                        <param-value>merge</param-value>
+                </init-param>
+                <init-param>
+                        <param-name>rootDirectory</param-name>
+                        <param-value>/tmp</param-value>
+                </init-param>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>uws_merge</servlet-name>
+                <url-pattern>/uws_merge/*</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>uws_mcutout</servlet-name>
+                <servlet-class>UWSMCutout</servlet-class>
+                <init-param>
+                        <param-name>name</param-name>
+                        <param-value>mcutout</param-value>
+                </init-param>
+                <init-param>
+                        <param-name>rootDirectory</param-name>
+                        <param-value>/tmp</param-value>
+                </init-param>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>uws_mcutout</servlet-name>
+                <url-pattern>/uws_mcutout/*</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>uws_soda</servlet-name>
+                <servlet-class>UWSSoda</servlet-class>
+                <init-param>
+                        <param-name>name</param-name>
+                        <param-value>soda_uws</param-value>
+                </init-param>
+                <init-param>
+                        <param-name>rootDirectory</param-name>
+                        <param-value>/tmp</param-value>
+                </init-param>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>uws_soda</servlet-name>
+                <url-pattern>/soda_uws/*</url-pattern>
+        </servlet-mapping>
+
+</web-app>
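
This descriptor differs from the previous one only in the IamTokenFilter mapped to /*: every request, including /availability and /capabilities, must carry an IAM-issued bearer token. A sketch of a client call under that assumption (the URL and the token source are placeholders):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class TokenClientExample {
    public static void main(String[] args) throws Exception {
        // Placeholder: a real client obtains the access token from the IAM server.
        String token = System.getenv("IAM_TOKEN");

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8080/vlkb-soda/availability"))
                .header("Authorization", "Bearer " + token)   // checked by IamTokenFilter
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode());
        System.out.println(response.body());
    }
}
```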
diff --git a/data-access/servlet/src/main/webapp/WEB-INF/web.xml b/data-access/servlet/src/main/webapp/WEB-INF/web.xml
new file mode 100644
index 0000000000000000000000000000000000000000..5cc30e2b2624cc6cb3cc03d49cb1c03abf79c8ca
--- /dev/null
+++ b/data-access/servlet/src/main/webapp/WEB-INF/web.xml
@@ -0,0 +1,202 @@
+<?xml version="1.0" encoding="UTF-8"?>
+
+<!--
+ Copyright 2004-2005 Sun Microsystems, Inc.  All rights reserved.
+ Use is subject to license terms.
+-->
+
+<web-app version="2.4" xmlns="http://java.sun.com/xml/ns/j2ee" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://java.sun.com/xml/ns/j2ee http://java.sun.com/xml/ns/j2ee/web-app_2_4.xsd">
+        <display-name>Via Lactea. Query FITS datacubes.</display-name>
+        <distributable/>
+
+
+<!-- uncomment the IA2 or the GARR token filter to enable security
+
+        <filter>
+                <filter-name>TokenFilter</filter-name>
+                <filter-class>it.inaf.ia2.aa.TokenFilter</filter-class>
+        </filter>
+
+        <filter-mapping>
+                <filter-name>TokenFilter</filter-name>
+                <url-pattern>/*</url-pattern>
+        </filter-mapping>
+
+        <filter>
+                <filter-name>UserTypeConverter</filter-name>
+                <filter-class>IA2TokenConvFilter</filter-class>
+        </filter>
+
+        <filter-mapping>
+                <filter-name>UserTypeConverter</filter-name>
+                <url-pattern>/*</url-pattern>
+        </filter-mapping>
+
+
+
+        <filter>
+                <filter-name>TokenFilter</filter-name>
+                <filter-class>NeaAuthFilter</filter-class>
+        </filter>
+        <filter-mapping>
+                <filter-name>TokenFilter</filter-name>
+                <url-pattern>/*</url-pattern>
+        </filter-mapping>
+
+-->
+<!-- in addition to one of the above token filters, uncomment this to enable the group-based authorization check
+        <filter>
+                <filter-name>AuthorizationResponseFilter</filter-name>
+                <filter-class>AuthorizationResponseFilter</filter-class>
+        </filter>
+        <filter-mapping>
+                <filter-name>AuthorizationResponseFilter</filter-name>
+                <url-pattern>/*</url-pattern>
+        </filter-mapping>
+-->
+
+
+
+        <servlet>
+                <servlet-name>default</servlet-name>
+                <servlet-class>
+                        org.apache.catalina.servlets.DefaultServlet
+                </servlet-class>
+                <init-param>
+                        <param-name>debug</param-name>
+                        <param-value>1</param-value>
+                </init-param>
+                <init-param>
+                        <param-name>listings</param-name>
+                        <param-value>true</param-value>
+                </init-param>
+                <load-on-startup>1</load-on-startup>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>default</servlet-name>
+                <url-pattern>/</url-pattern>
+        </servlet-mapping>
+
+
+
+
+
+
+
+
+        <servlet>
+                <servlet-name>vlkb_cutout</servlet-name>
+                <servlet-class>ServletCutout</servlet-class>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>vlkb_cutout</servlet-name>
+                <url-pattern>/vlkb_cutout</url-pattern>
+        </servlet-mapping>
+
+        <servlet>
+                <servlet-name>vlkb_mcutout</servlet-name>
+                <servlet-class>ServletMCutout</servlet-class>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>vlkb_mcutout</servlet-name>
+                <url-pattern>/vlkb_mcutout</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>vlkb_merge</servlet-name>
+                <servlet-class>ServletMerge</servlet-class>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>vlkb_merge</servlet-name>
+                <url-pattern>/vlkb_merge</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>vlkb_vosi_availability</servlet-name>
+                <servlet-class>VlkbServletFile</servlet-class>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>vlkb_vosi_availability</servlet-name>
+                <url-pattern>/availability</url-pattern>
+        </servlet-mapping>
+
+        <servlet>
+                <servlet-name>vlkb_vosi_capabilities</servlet-name>
+                <servlet-class>VlkbServletFile</servlet-class>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>vlkb_vosi_capabilities</servlet-name>
+                <url-pattern>/capabilities</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>vlkb_soda</servlet-name>
+                <servlet-class>ServletCutout</servlet-class>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>vlkb_soda</servlet-name>
+                <url-pattern>/soda</url-pattern>
+        </servlet-mapping>
+        <servlet-mapping>
+                <servlet-name>vlkb_soda</servlet-name>
+                <url-pattern>/vlkb_soda</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>uws_merge</servlet-name>
+                <servlet-class>UWSMerge</servlet-class>
+                <init-param>
+                        <param-name>name</param-name>
+                        <param-value>merge</param-value>
+                </init-param>
+                <init-param>
+                        <param-name>rootDirectory</param-name>
+                        <param-value>/tmp</param-value>
+                </init-param>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>uws_merge</servlet-name>
+                <url-pattern>/uws_merge/*</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>uws_mcutout</servlet-name>
+                <servlet-class>UWSMCutout</servlet-class>
+                <init-param>
+                        <param-name>name</param-name>
+                        <param-value>mcutout</param-value>
+                </init-param>
+                <init-param>
+                        <param-name>rootDirectory</param-name>
+                        <param-value>/tmp</param-value>
+                </init-param>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>uws_mcutout</servlet-name>
+                <url-pattern>/uws_mcutout/*</url-pattern>
+        </servlet-mapping>
+
+
+        <servlet>
+                <servlet-name>uws_soda</servlet-name>
+                <servlet-class>UWSSoda</servlet-class>
+                <init-param>
+                        <param-name>name</param-name>
+                        <param-value>soda_uws</param-value>
+                </init-param>
+                <init-param>
+                        <param-name>rootDirectory</param-name>
+                        <param-value>/tmp</param-value>
+                </init-param>
+        </servlet>
+        <servlet-mapping>
+                <servlet-name>uws_soda</servlet-name>
+                <url-pattern>/soda_uws/*</url-pattern>
+        </servlet-mapping>
+
+</web-app>
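
For reference, this is the general shape of a servlet filter like the ones named in the commented block above, written against the javax.servlet 3.1 API bundled in java-libs/lib/. It is a sketch, not the actual IamTokenFilter or NeaAuthFilter source: mapping such a filter to /* means it runs before every servlet and can reject unauthenticated requests outright.

```java
import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class ExampleTokenFilter implements Filter {
    @Override public void init(FilterConfig config) {}
    @Override public void destroy() {}

    @Override
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        String auth = ((HttpServletRequest) req).getHeader("Authorization");
        if (auth == null || !auth.startsWith("Bearer ")) {
            // No usable token: stop here, before any servlet is reached.
            ((HttpServletResponse) res).sendError(HttpServletResponse.SC_UNAUTHORIZED);
            return;
        }
        // A real filter would now validate the token, e.g. parse the JWT
        // with the jjwt jars shipped in java-libs/lib/.
        chain.doFilter(req, res);
    }
}
```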
diff --git a/data-access/servlet/src/main/webapp/index.html b/data-access/servlet/src/main/webapp/index.html
new file mode 100644
index 0000000000000000000000000000000000000000..7084b04e496ddf7f1c0e0faed1754342c9460076
--- /dev/null
+++ b/data-access/servlet/src/main/webapp/index.html
@@ -0,0 +1,31 @@
+<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN"
+ "http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">
+
+<html xmlns="http://www.w3.org/1999/xhtml"
+      xml:lang="en" lang="en">
+  <head>
+    <title>ViaLactea: access FITS files</title>
+    <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.12.4/jquery.min.js"></script>
+    <script src="xml2html.js"></script>
+  </head>
+  <body>
+    <div>
+      Spatial axes: galactic coordinates [l,b] and radius r define a circle:<br />
+      <br />
+      [l,b]: <input id="lon" type="text" name="l" value="-78.8" />
+             <input id="lat" type="text" name="b" value="-1.6" />[deg]<br />
+      r: <input id="radius"  type="text" name="r" value="0.1" />[deg]<br />
+      <br />
+      Spectral axis: velocity bounds:<br />
+      <br />
+      [vlow,vup]: <input id="vl" type="text" name="vl" value="-30.0" />
+      <input id="vu" type="text" name="vu" value="20.0" />[km/s]<br />
+      <p>
+        <!--<input type="submit" value="Send &gt;&gt;" />-->
+        <button type="button" id="invButton">Search</button>
+      </p>
+    </div>
+    
+    <div id="gendHtml" > </div>
+  </body>
+</html>
diff --git a/java-libs/README.postgresql-jar b/java-libs/README.postgresql-jar
new file mode 100644
index 0000000000000000000000000000000000000000..50eff7de0a7726529dbb324ec9ef278cd49b87fd
--- /dev/null
+++ b/java-libs/README.postgresql-jar
@@ -0,0 +1,16 @@
+
+
+The postgresql-*.jar library must be loaded by Tomcat: place it on Tomcat's classpath (.../tomcat/lib/).
+
+Note the access rights and ownership (tomcat:tomcat):
+
+-rw-r-----. 1 tomcat tomcat 825943 Mar 23 08:03 /opt/tomcat/latest/lib/postgresql-42.2.5.jar
+
+Also, the scheme in the JDBC connection URL must be jdbc:postgresql (not just postgresql://):
+
+jdbc:postgresql://127.0.0.1:5432/vialactea
+
+
+Manual: https://jdbc.postgresql.org/documentation/use/
+
+
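
A minimal sketch of what the jdbc:postgresql scheme buys you, using the database name from the URL above (the user name and password are placeholders):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class JdbcSchemeExample {
    public static void main(String[] args) throws Exception {
        // The jdbc:postgresql prefix is what lets DriverManager pick the
        // PostgreSQL driver from tomcat/lib; a bare postgresql:// URL fails.
        String url = "jdbc:postgresql://127.0.0.1:5432/vialactea";
        try (Connection conn = DriverManager.getConnection(url, "vlkb", "secret");
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SELECT version()")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}
```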
diff --git a/java-libs/jjwt-api-0.11.2.jar b/java-libs/jjwt-api-0.11.2.jar
new file mode 100644
index 0000000000000000000000000000000000000000..15f236c815d33220a193cc5e3c26bab5e448683b
Binary files /dev/null and b/java-libs/jjwt-api-0.11.2.jar differ
diff --git a/java-libs/jjwt-impl-0.11.2.jar b/java-libs/jjwt-impl-0.11.2.jar
new file mode 100644
index 0000000000000000000000000000000000000000..3594cb18f648035464f9ef3866c68de5a4f01e42
Binary files /dev/null and b/java-libs/jjwt-impl-0.11.2.jar differ
diff --git a/java-libs/jjwt-jackson-0.11.2.jar b/java-libs/jjwt-jackson-0.11.2.jar
new file mode 100644
index 0000000000000000000000000000000000000000..f081d3f964a9700849c56380951d679e8d5bd24b
Binary files /dev/null and b/java-libs/jjwt-jackson-0.11.2.jar differ
diff --git a/java-libs/lib/FastInfoset-1.2.16.jar b/java-libs/lib/FastInfoset-1.2.16.jar
new file mode 100644
index 0000000000000000000000000000000000000000..6e91f1cec27721fd3d6b35522099cca9b686211d
Binary files /dev/null and b/java-libs/lib/FastInfoset-1.2.16.jar differ
diff --git a/java-libs/lib/auth-lib-2.0.0-SNAPSHOT.jar b/java-libs/lib/auth-lib-2.0.0-SNAPSHOT.jar
new file mode 100644
index 0000000000000000000000000000000000000000..b25b368692395a76cae52fb5a72998728d198e38
Binary files /dev/null and b/java-libs/lib/auth-lib-2.0.0-SNAPSHOT.jar differ
diff --git a/java-libs/lib/cache2k-api-1.2.4.Final.jar b/java-libs/lib/cache2k-api-1.2.4.Final.jar
new file mode 100644
index 0000000000000000000000000000000000000000..8075b2757c7843e83ba5d19f08b166a3b612f19e
Binary files /dev/null and b/java-libs/lib/cache2k-api-1.2.4.Final.jar differ
diff --git a/java-libs/lib/cache2k-core-1.2.4.Final.jar b/java-libs/lib/cache2k-core-1.2.4.Final.jar
new file mode 100644
index 0000000000000000000000000000000000000000..c9133750cd6b43168402a716bf7e378ccc2cad74
Binary files /dev/null and b/java-libs/lib/cache2k-core-1.2.4.Final.jar differ
diff --git a/java-libs/lib/commons-beanutils-1.8.3.jar b/java-libs/lib/commons-beanutils-1.8.3.jar
new file mode 100644
index 0000000000000000000000000000000000000000..218510bc5d65ac161aca7e4b2cb575fbbee5d313
Binary files /dev/null and b/java-libs/lib/commons-beanutils-1.8.3.jar differ
diff --git a/java-libs/lib/commons-lang3-3.12.0.jar b/java-libs/lib/commons-lang3-3.12.0.jar
new file mode 100644
index 0000000000000000000000000000000000000000..4d434a2a4554815584365348ea2cf00cdfe3d5f9
Binary files /dev/null and b/java-libs/lib/commons-lang3-3.12.0.jar differ
diff --git a/java-libs/lib/fits.jar b/java-libs/lib/fits.jar
new file mode 100644
index 0000000000000000000000000000000000000000..bcd410a9b339c07ad29e3ea2a5dc0d16783ac7de
Binary files /dev/null and b/java-libs/lib/fits.jar differ
diff --git a/java-libs/lib/istack-commons-runtime-3.0.8.jar b/java-libs/lib/istack-commons-runtime-3.0.8.jar
new file mode 100644
index 0000000000000000000000000000000000000000..8f37e950bec36e0d8b34697ce2e9e57bb2d325cb
Binary files /dev/null and b/java-libs/lib/istack-commons-runtime-3.0.8.jar differ
diff --git a/java-libs/lib/jackson-annotations-2.9.10.jar b/java-libs/lib/jackson-annotations-2.9.10.jar
new file mode 100644
index 0000000000000000000000000000000000000000..de054c66b25e559d868336a9ca85c1ce663673b1
Binary files /dev/null and b/java-libs/lib/jackson-annotations-2.9.10.jar differ
diff --git a/java-libs/lib/jackson-core-2.9.10.jar b/java-libs/lib/jackson-core-2.9.10.jar
new file mode 100644
index 0000000000000000000000000000000000000000..1b5e87ccc0adde69fc1499539a29e87a0f416832
Binary files /dev/null and b/java-libs/lib/jackson-core-2.9.10.jar differ
diff --git a/java-libs/lib/jackson-databind-2.9.10.4.jar b/java-libs/lib/jackson-databind-2.9.10.4.jar
new file mode 100644
index 0000000000000000000000000000000000000000..9045f2f40240a4b62ec959b8440d10165f0581ea
Binary files /dev/null and b/java-libs/lib/jackson-databind-2.9.10.4.jar differ
diff --git a/java-libs/lib/jakarta.activation-api-1.2.1.jar b/java-libs/lib/jakarta.activation-api-1.2.1.jar
new file mode 100644
index 0000000000000000000000000000000000000000..bbfb52ff01e082afd65ee6444f2645e999f98ee0
Binary files /dev/null and b/java-libs/lib/jakarta.activation-api-1.2.1.jar differ
diff --git a/java-libs/lib/jakarta.json-1.1.5.jar b/java-libs/lib/jakarta.json-1.1.5.jar
new file mode 100644
index 0000000000000000000000000000000000000000..f96ee9b05650b1bf38f31e554191d8c312bd8213
Binary files /dev/null and b/java-libs/lib/jakarta.json-1.1.5.jar differ
diff --git a/java-libs/lib/jakarta.json-api-1.1.5.jar b/java-libs/lib/jakarta.json-api-1.1.5.jar
new file mode 100644
index 0000000000000000000000000000000000000000..50995a4584e03fcf3a8726fd3f19d80386dee961
Binary files /dev/null and b/java-libs/lib/jakarta.json-api-1.1.5.jar differ
diff --git a/java-libs/lib/jakarta.json.bind-api-1.0.1.jar b/java-libs/lib/jakarta.json.bind-api-1.0.1.jar
new file mode 100644
index 0000000000000000000000000000000000000000..1dbfe8c848b531165b322438af2f206209e93e06
Binary files /dev/null and b/java-libs/lib/jakarta.json.bind-api-1.0.1.jar differ
diff --git a/java-libs/lib/jakarta.xml.bind-api-2.3.2.jar b/java-libs/lib/jakarta.xml.bind-api-2.3.2.jar
new file mode 100644
index 0000000000000000000000000000000000000000..b16236d561c08c177f79a99e2c7a47be1f81616d
Binary files /dev/null and b/java-libs/lib/jakarta.xml.bind-api-2.3.2.jar differ
diff --git a/java-libs/lib/javax.servlet-api-3.1.0.jar b/java-libs/lib/javax.servlet-api-3.1.0.jar
new file mode 100644
index 0000000000000000000000000000000000000000..6b14c3d267867e76c04948bb31b3de18e01412ee
Binary files /dev/null and b/java-libs/lib/javax.servlet-api-3.1.0.jar differ
diff --git a/java-libs/lib/jaxb-runtime-2.3.2.jar b/java-libs/lib/jaxb-runtime-2.3.2.jar
new file mode 100644
index 0000000000000000000000000000000000000000..62f8719664cb8db795d47b0ab5f323455bb7461a
Binary files /dev/null and b/java-libs/lib/jaxb-runtime-2.3.2.jar differ
diff --git a/java-libs/lib/jcl-over-slf4j-1.7.5.jar b/java-libs/lib/jcl-over-slf4j-1.7.5.jar
new file mode 100644
index 0000000000000000000000000000000000000000..90153b069614af87e7bc9664c6d8e4d1414c2dff
Binary files /dev/null and b/java-libs/lib/jcl-over-slf4j-1.7.5.jar differ
diff --git a/java-libs/lib/jjwt-api-0.12.3.jar b/java-libs/lib/jjwt-api-0.12.3.jar
new file mode 100644
index 0000000000000000000000000000000000000000..28a50518532b198f97aeaecd2bc7b04783544fa7
Binary files /dev/null and b/java-libs/lib/jjwt-api-0.12.3.jar differ
diff --git a/java-libs/lib/jjwt-impl-0.12.3.jar b/java-libs/lib/jjwt-impl-0.12.3.jar
new file mode 100644
index 0000000000000000000000000000000000000000..458f09c2181a64c7f33bf025bd7c82004d6b57ef
Binary files /dev/null and b/java-libs/lib/jjwt-impl-0.12.3.jar differ
diff --git a/java-libs/lib/jjwt-jackson-0.12.3.jar b/java-libs/lib/jjwt-jackson-0.12.3.jar
new file mode 100644
index 0000000000000000000000000000000000000000..c7da9cddec6bd5fa0ef8ba4618a0837434862be0
Binary files /dev/null and b/java-libs/lib/jjwt-jackson-0.12.3.jar differ
diff --git a/java-libs/lib/json-simple-1.1.1.jar b/java-libs/lib/json-simple-1.1.1.jar
new file mode 100644
index 0000000000000000000000000000000000000000..dfd5856d0cad81dfe7845ea6ff4d170d8064d7b0
Binary files /dev/null and b/java-libs/lib/json-simple-1.1.1.jar differ
diff --git a/java-libs/lib/logback-classic-1.0.13.jar b/java-libs/lib/logback-classic-1.0.13.jar
new file mode 100644
index 0000000000000000000000000000000000000000..80bf5d15a20cb055ef1cad792663c2151c979fed
Binary files /dev/null and b/java-libs/lib/logback-classic-1.0.13.jar differ
diff --git a/java-libs/lib/logback-core-1.0.13.jar b/java-libs/lib/logback-core-1.0.13.jar
new file mode 100644
index 0000000000000000000000000000000000000000..568ccfaae59d083b2d3230e1b7aeeab012defa8d
Binary files /dev/null and b/java-libs/lib/logback-core-1.0.13.jar differ
diff --git a/java-libs/lib/mysql-connector-java-5.1.13-bin.jar b/java-libs/lib/mysql-connector-java-5.1.13-bin.jar
new file mode 100644
index 0000000000000000000000000000000000000000..6b5b2ba2694649cd8a79db3a34a5391338beb18f
Binary files /dev/null and b/java-libs/lib/mysql-connector-java-5.1.13-bin.jar differ
diff --git a/java-libs/lib/opencsv-5.7.1.jar b/java-libs/lib/opencsv-5.7.1.jar
new file mode 100644
index 0000000000000000000000000000000000000000..ea50aa6d7caae33fdd21bc42c95c851982f316a7
Binary files /dev/null and b/java-libs/lib/opencsv-5.7.1.jar differ
diff --git a/java-libs/lib/rabbitmq-client.jar b/java-libs/lib/rabbitmq-client.jar
new file mode 100644
index 0000000000000000000000000000000000000000..e718167753cc7c337225e335c225af5e12583667
Binary files /dev/null and b/java-libs/lib/rabbitmq-client.jar differ
diff --git a/java-libs/lib/rap-client-1.0-SNAPSHOT.jar b/java-libs/lib/rap-client-1.0-SNAPSHOT.jar
new file mode 100644
index 0000000000000000000000000000000000000000..ea95f683e3e2f9a0feb2b28d843718966bdb3991
Binary files /dev/null and b/java-libs/lib/rap-client-1.0-SNAPSHOT.jar differ
diff --git a/java-libs/lib/shiro-core-1.2.2.jar b/java-libs/lib/shiro-core-1.2.2.jar
new file mode 100644
index 0000000000000000000000000000000000000000..35499f8da162a7022551a14a734dba7c17a7b939
Binary files /dev/null and b/java-libs/lib/shiro-core-1.2.2.jar differ
diff --git a/java-libs/lib/shiro-web-1.2.2.jar b/java-libs/lib/shiro-web-1.2.2.jar
new file mode 100644
index 0000000000000000000000000000000000000000..358c1102cef27dab2d88d15184a9cadcd549b49a
Binary files /dev/null and b/java-libs/lib/shiro-web-1.2.2.jar differ
diff --git a/java-libs/lib/slf4j-api-1.7.5.jar b/java-libs/lib/slf4j-api-1.7.5.jar
new file mode 100644
index 0000000000000000000000000000000000000000..8f004d3906fc9041ef1f6923b1bc9f7d522942b2
Binary files /dev/null and b/java-libs/lib/slf4j-api-1.7.5.jar differ
diff --git a/java-libs/lib/stax-ex-1.8.1.jar b/java-libs/lib/stax-ex-1.8.1.jar
new file mode 100644
index 0000000000000000000000000000000000000000..a200db53295bfa16e99e3063c98b6498f89f5939
Binary files /dev/null and b/java-libs/lib/stax-ex-1.8.1.jar differ
diff --git a/java-libs/lib/stil.jar b/java-libs/lib/stil.jar
new file mode 100644
index 0000000000000000000000000000000000000000..1c7d53a179b8dd7c4d50f91a028d533e4cf59aa6
Binary files /dev/null and b/java-libs/lib/stil.jar differ
diff --git a/java-libs/lib/txw2-2.3.2.jar b/java-libs/lib/txw2-2.3.2.jar
new file mode 100644
index 0000000000000000000000000000000000000000..0d5ac012dd14132a02347a8ad5fd8df363feca2a
Binary files /dev/null and b/java-libs/lib/txw2-2.3.2.jar differ
diff --git a/java-libs/lib/uws 4.4.jar b/java-libs/lib/uws 4.4.jar
new file mode 100644
index 0000000000000000000000000000000000000000..bcb92b4177679c1313b42f4f00f2fdc8bedd8239
Binary files /dev/null and b/java-libs/lib/uws 4.4.jar differ
diff --git a/java-libs/lib/yasson-1.0.3.jar b/java-libs/lib/yasson-1.0.3.jar
new file mode 100644
index 0000000000000000000000000000000000000000..dfa24dbe828a272788d93cf5801725d811252128
Binary files /dev/null and b/java-libs/lib/yasson-1.0.3.jar differ
diff --git a/java-libs/postgresql-42.2.5.jar b/java-libs/postgresql-42.2.5.jar
new file mode 100644
index 0000000000000000000000000000000000000000..d89d4331a40f7a6c2d748774a8ad9a304a3d209d
Binary files /dev/null and b/java-libs/postgresql-42.2.5.jar differ