
@sap/cds-dk

Comparing version 6.0.4 to 6.1.1

bin/pull.js


bin/add.js

@@ -16,27 +16,37 @@

*hana* - adds configuration for SAP HANA deployment
*hana* - adds configuration for SAP HANA deployment
*xsuaa* - adds configuration for authentication via XSUAA
*xsuaa* - adds configuration for authentication via XSUAA
*mtx* - adds configuration for multitenancy and extensibility
*multitenancy* - adds configuration for multitenancy
*approuter* - adds configuration for the Cloud Foundry approuter
*toggles* - adds configuration for feature toggles
*mta* - adds an _mta.yaml_ file for MTA-based SAP Business Technology Platform deployment
*extensibility* - adds configuration for extensibility
*cf-manifest* - adds _manifest.yml_ and _services-manifest.yml_ files for Cloud Foundry native deployment
*mtx* - adds configuration for multitenancy, feature toggles and extensibility
This approach makes use of the 'Create-Service-Push Plugin' that has to be installed
separately using _cf install-plugin Create-Service-Push_.
_cf create-service-push_ creates the service(s) and pushes the application(s) to the
SAP Business Technology Platform.
*approuter* - adds configuration for the Cloud Foundry approuter
*pipeline* - adds files for CI/CD pipeline integration
*mta* - adds an _mta.yaml_ file for MTA-based SAP Business Technology Platform deployment
*samples* - adds simple sample files
*cf-manifest* - adds _manifest.yml_ and _services-manifest.yml_ files for Cloud Foundry native deployment
*notebook* - adds a Jupyter Notebook (beta)
This approach makes use of the 'Create-Service-Push Plugin' that has to be installed
separately using _cf install-plugin Create-Service-Push_.
_cf create-service-push_ creates the service(s) and pushes the application(s) to the
SAP Business Technology Platform.
*data* - adds data content for CSN model entities (as csv files)
*pipeline* - adds files for CI/CD pipeline integration
*samples* - adds simple sample files
*data* - adds data content for CSN model entities (as csv files)
*helm* - adds helm chart folder that contains the files needed to deploy the application using Kyma.
To add a specific feature in the helm chart you can use the command 'cds add helm:<featureName>'.
For example cds add helm:xsuaa and cds add helm:html5_apps_deployer.
# OPTIONS

@@ -60,7 +70,10 @@

*cds add* hana,mta,pipeline
*cds add* mtx --for production
*cds add* mta,pipeline
*cds add* mtx
*cds add* hana,xsuaa --for production
*cds add* cf-manifest
*cds add* data --for my.namespace.MyEntity
*cds add* helm
# SEE ALSO

@@ -67,0 +80,0 @@

module.exports = Object.assign ( compile, {
options: [
'--from', '--service', '--lang', '--for', '--to', '--dest', '--log-level', '--flavor', '--dialect', '--sql-dialect', '--openapi:url'
'--from', '--service', '--lang', '--for', '--to', '--dest', '--log-level', '--flavor', '--dialect', '--sql-dialect', '--openapi:url', '--openapi:servers'
],

@@ -108,2 +108,7 @@ shortcuts: [

*--openapi:servers* <Stringified JSON Object for Open API export>
The servers definition used in the generated OpenAPI document. *--openapi:url* is
ignored when this option is specified.
*--openapi:diagram*

@@ -110,0 +115,0 @@

const [arg0, arg1, _, ...args] = process.argv // eslint-disable-line no-unused-vars
const files = require('../lib/cds').resolve(args.length ? args : '*')
const cds = require('../lib/cds')
const files = cds.resolve(args.length ? args : '*')
process.argv = [ arg0, arg1, 'ria', ...files ]

@@ -10,4 +11,5 @@

}
require ('@sap/cds-compiler/bin/cdsv2m')
require(require.resolve ('@sap/cds-compiler/bin/cdsv2m', {paths: [ cds.home, process.cwd(), __dirname ]}))
module.exports = ()=>{}

@@ -53,2 +53,7 @@ const { URLS } = require('../lib/init/constants')

*helm* - adds helm chart folder that contains the files needed to deploy the application using Kyma.
To add a specific feature in the helm chart you can use the command 'cds add helm:<featureName>'.
For example cds add helm:xsuaa and cds add helm:html5_apps_deployer.
*--java:mvn* <Comma separated maven archetype specific parameters>

@@ -55,0 +60,0 @@

@@ -74,2 +74,7 @@ const { bold } = require('../lib/util/term');

*-u* | *--username* <name>
Username for authentication with Basic Auth in test scenarios. For security reasons,
username will only be saved for app URLs that look like localhost.
*-c* | *--clientid* <clientid>

@@ -107,7 +112,8 @@

tokenStorage: options.plain ? 'plain' : 'keyring',
renewLogin: true
renewLogin: true,
saveData: true
})
delete options.plain
await require('../lib/client/auth_manager').loginAndUpdateSettings(options)
await require('../lib/client/auth_manager').login(options)
}

@@ -114,0 +120,0 @@

@@ -0,1 +1,2 @@

# Change Log

@@ -9,2 +10,39 @@

## Version 6.1.1 - 2022-08-11
### Added
- `cds import` now supports OData and SAP annotations for OData V4 imports.
- `cds compile --to openapi` defines operation-specific HTTP error response status codes with descriptions via `ErrorResponses` property of certain annotations.
- `cds compile --to openapi` now supports `--openapi:servers` option.
- `cds add multitenancy` will add feature multitenancy-specific configuration, without extensibility.
- `cds add toggles` will add feature toggle-specific configuration.
- `cds add extensibility` will add configuration for project extensibility.
- `cds add helm` now supports resource configuration for HANA deployment job and HTML5 app deployment job.
- `cds add helm` added JSON Schema for values.yaml
- `cds pull` will download the current CDS model of an extended SaaS app running with @sap/cds-mtxs.
- `cds push` will upload an extension to a SaaS app running with @sap/cds-mtxs.
### Changed
- `cds add helm` updated default resource requirements for both `java` and `nodejs` projects.
- `cds add helm` uses servicebinding.io bindings for CAP Java services, HANA and HTML5 app deployment jobs.
- `cds compile --to openapi` creates only component schemas for schemas referenced in operations and in other schemas.
- `cds import` switched from `@openapi.schema` to `@JSON.Schema`.
- `cds add mtx` will now add configuration for streamlined MTX. It effectively acts as a shortcut for `cds add multitenancy,toggles,extensibility`
- `cds add mtx` no longer includes `hana` and `xsuaa`. To achieve the same effect as before, run `cds add mtx && cds add hana,xsuaa --for production`.
- `cds bind -2 <xsuaa service instance>` binds the CDS `auth` service to the xsuaa instance. Previously `uaa` was used. This requires `@sap/cds` 6 or higher.
- `cds add lint:dev` updated to adjust to new api structure of `@sap/eslint-plugin-cds` v2.5.0
- `cds init` uses latest Maven Java archetype version 1.27.0 for creating Java projects.
- `cds login localhost:<port> -u <username>` now saves username (and empty password, if applicable) with project settings for convenience.
### Fixed
- `cds add helm:connectivity`: `connectivity.configMapName` was not used for the `connectivity-proxy-info`.
- `cds import` replaced occurrences of `\\` with `/` in the `package.json` for Linux platforms.
- `cds import` fixed `@Core.Description` and `doc` property duplication.
- `cds extend` and `cds activate` no longer save any data (this is reserved to `cds login`).
- Extensibility commands now add http (not https) to local app URLs without schema.
- Extensibility commands don't query CF any longer when run against local apps.
## Version 6.0.4 - 2022-08-02

@@ -18,3 +56,2 @@

## Version 6.0.3 - 2022-07-14

@@ -29,2 +66,4 @@

- `--vap-file` parameter of `cds deploy` is available again (removed in 6.0.0)
- `cds add helm:connectivity`: `connectivity.configMapName` was not used for the `connectivity-proxy-info`.
- `--vcap-file` parameter of `cds deploy` is available again (removed in 6.0.0)

@@ -31,0 +70,0 @@ - `cds add helm:connectivity`: `connectivity.configMapName` was not used for the `connectivity-proxy-info`.

@@ -6,4 +6,3 @@

const KindToRequiresNameMap = {
xsuaa: 'uaa',
'xsuaa-auth': 'uaa',
xsuaa: 'auth',
'hana-cloud': 'db',

@@ -10,0 +9,0 @@ hana: 'db',

@@ -45,3 +45,3 @@ const fs = require('fs').promises;

try {
const params = await AuthManager.loginAndUpdateSettings(paramValues);
const params = await AuthManager.login(paramValues);

@@ -48,0 +48,0 @@ if (process.env.DEBUG) {

@@ -80,3 +80,3 @@ const util = require('util');

static async loginAndUpdateSettings(paramValues) {
static async login(paramValues) {
SettingsManager.init();

@@ -90,3 +90,5 @@ const params = new ParamCollection(paramValues);

if (!params.has('token')) {
await SettingsManager.saveSettings(params);
if (params.get('saveData')) {
await SettingsManager.saveSettings(params);
}
throw new Error('Failed to login: no valid token or passcode provided. ' +

@@ -99,3 +101,5 @@ 'Get a passcode' + (params.get('passcodeUrl') ? ' from ' + params.get('passcodeUrl') : '') +

await SettingsManager.saveSettings(params); // saves token conditionally
if (params.get('saveData')) {
await SettingsManager.saveSettings(params); // saves token conditionally
}

@@ -102,0 +106,0 @@ return params;

@@ -74,3 +74,3 @@ const fs = require('fs');

try {
const params = await AuthManager.loginAndUpdateSettings(paramValues);
const params = await AuthManager.login(paramValues);

@@ -77,0 +77,0 @@ if (process.env.DEBUG) {

@@ -36,2 +36,3 @@ const os = require('os');

Your choice: `);
console.log();

@@ -38,0 +39,0 @@ if (!choice) {

const IP_ADDR = /^[1-9][0-9]{0,2}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}$/;
const schemaRegex = /^https?:\/\//;
const defaultSchema = 'https://';
const localhostRegex = /^(https?:\/\/)?(localhost|127\.0\.0\.1|\[?::1]?)\b/;
const httpSchema = 'http://';
const httpsSchema = 'https://';
function extractSubdomain(url) {
const hostname = new URL(url).hostname;
if (hostname === 'localhost' || IP_ADDR.test(hostname)) {
if (localhostRegex.test(hostname) || IP_ADDR.test(hostname)) {
return undefined;

@@ -15,4 +17,6 @@ }

schemaRegex,
defaultSchema,
localhostRegex,
httpSchema,
httpsSchema,
extractSubdomain
};
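As a side note on the helper above: the new `localhostRegex`, `httpSchema`, and `httpsSchema` exports let login code default local app URLs to `http://` and everything else to `https://`. The following is a hedged sketch (not the shipped implementation) of how they combine; `withSchema` is an illustrative name.

```javascript
// Sketch only: combine the url_helper exports above to add a default
// schema. Local-looking hosts get http://, everything else https://.
const localhostRegex = /^(https?:\/\/)?(localhost|127\.0\.0\.1|\[?::1]?)\b/;
const schemaRegex = /^https?:\/\//;
const httpSchema = 'http://';
const httpsSchema = 'https://';

function withSchema(appUrl) {
  if (schemaRegex.test(appUrl)) return appUrl; // schema already present
  return (localhostRegex.test(appUrl) ? httpSchema : httpsSchema) + appUrl;
}
```

For example, `withSchema('localhost:4004')` yields an `http://` URL, while a Cloud Foundry-style hostname gets `https://`.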

@@ -10,3 +10,3 @@ const Activate = require('./activate');

activate: Activate.run,
login: AuthManager.loginAndUpdateSettings,
login: AuthManager.login,
logout: AuthManager.logout,

@@ -13,0 +13,0 @@ question: Question,

@@ -166,7 +166,9 @@ const Logger = require('./helper/logging');

this.passcode = new Param('passcode', '', { obfuscate: ObfuscationLevel.partial });
this.tokenStorage = new Param('tokenStorage', undefined, { allowedValues: ['plain', 'keyring'] });
this.tokenStorage = new Param('tokenStorage', undefined, { allowedValues: ['plain', 'keyring'] });
this.plain = new Param('plain', false, { internal: true }); // for backward compatibility - ignored
this.saveData = new Param('saveData', false, { internal: true });
this.renewLogin = new Param('renewLogin', false);
this.username = new Param('username', '');
this.username = new Param('username', '', { persist: Persistence.setting }); // only saved against localhost (non-productive); as setting due to ambiguous app URL
this.password = new Param('password', '', { obfuscate: ObfuscationLevel.full });
this.isEmptyPassword = new Param('isEmptyPassword', false, { internal: true, persist: Persistence.setting }); // extra param so we can ensure we never save a password
this.clientid = new Param('clientid', '');

@@ -173,0 +175,0 @@ this.clientsecret = new Param('clientsecret', '', { obfuscate: ObfuscationLevel.partial });

@@ -12,5 +12,6 @@ const fs = require('fs');

const Question = require('./helper/question');
const { defaultSchema, extractSubdomain, schemaRegex } = require('./helper/url_helper');
const { httpSchema, httpsSchema, extractSubdomain, schemaRegex, localhostRegex } = require('./helper/url_helper');
const { ParamCollection, Persistence } = require('./params');
const { capitalize } = require('./helper/string_helper');
const axios = require('axios');

@@ -35,3 +36,4 @@ const CONFIG_SUBDIRS = {

const MTX_FULLY_QUALIFIED = 'com.sap.cds.mtx';
const OAUTH_PATH = '/mtx/v1/oauth/token';
const OAUTH_PATH_OLD_MTX = '/mtx/v1/oauth/token';
const OAUTH_PATH_STREAMLINED_MTX = '/-/cds/login/token';

@@ -93,3 +95,3 @@ const SETTINGS_DIR = path.join(os.homedir(), CONFIG_SUBDIRS[os.platform()] || '', MTX_FULLY_QUALIFIED);

function getTokenBaseUrl(params, renewUrl) {
async function getTokenBaseUrl(params, renewUrl) {
if (params.has('tokenUrl') && !renewUrl) {

@@ -102,4 +104,26 @@ return params.get('tokenUrl');

const baseUrl = new URL(params.get('appUrl'));
baseUrl.pathname = baseUrl.pathname.replace(/\/?$/, OAUTH_PATH);
return baseUrl.toString();
const cleanPathname = pathname => baseUrl.pathname.replace(/\/?$/, pathname);
const getBaseUrl = pathname => {
const url = new URL(baseUrl.toString());
url.pathname = cleanPathname(pathname);
return url;
};
let existingPath;
const oauthPaths = [OAUTH_PATH_STREAMLINED_MTX/*, OAUTH_PATH_OLD_MTX*/]; // The old MTX does not support HEAD requests
for (const path of oauthPaths) {
try {
await axios.head(getBaseUrl(path).toString());
existingPath = path;
break;
} catch (e) {
if (e.response?.status === 404) {
continue;
} else {
throw Error(`Failed with root cause ${e.message}`);
}
}
}
existingPath = existingPath ?? OAUTH_PATH_OLD_MTX;
return getBaseUrl(existingPath).toString();
}
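The probe-and-fall-back pattern `getTokenBaseUrl` now uses can be sketched generically: HEAD each candidate OAuth path, treat a 404 as "try the next one", rethrow anything else, and default to the legacy MTX path (which does not support HEAD). The `head` callback below stands in for the `axios.head` call, and the function name is illustrative.

```javascript
// Generic sketch of the OAuth-path probing loop above. `head` is an
// injected async function standing in for axios.head.
async function probeOauthPath(candidates, legacyPath, head) {
  for (const path of candidates) {
    try {
      await head(path);
      return path; // first path that answers wins
    } catch (e) {
      if (e.response?.status === 404) continue; // not served here, keep looking
      throw new Error(`Failed with root cause ${e.message}`);
    }
  }
  return legacyPath; // nothing answered HEAD: assume the old MTX endpoint
}
```

Injecting `head` keeps the fallback logic testable without any network access.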

@@ -141,3 +165,12 @@

await this._saveToFile(params.get('projectFolder'), params.toValueMap(Persistence.setting));
const paramsMap = params.toValueMap(Persistence.setting);
if (localhostRegex.test(params.get('appUrl'))) {
Logger.log(`Saving username${(params.get('isEmptyPassword') ? ' and empty-password hint' : '')} with project settings.`);
} else {
// Save only against localhost for security reasons.
Logger.debug('Not saving username because app is not recognized to run on localhost.');
delete paramsMap.username;
delete paramsMap.isEmptyPassword;
}
await this._saveToFile(params.get('projectFolder'), paramsMap);

@@ -181,5 +214,12 @@ if (!params.has('token')) {

async function addAppUrlAndSubdomain() {
if (params.has('appUrl') && !schemaRegex.test(params.get('appUrl'))) {
params.set('appUrl', defaultSchema + params.get('appUrl'));
Logger.debug(`Added schema to app URL: ${params.get('appUrl')}`);
if (params.has('appUrl')) {
const appUrl = params.get('appUrl');
if (!schemaRegex.test(appUrl)) {
if (localhostRegex.test(appUrl)) {
params.set('appUrl', httpSchema + appUrl);
} else {
params.set('appUrl', httpsSchema + appUrl);
}
Logger.debug(`Added schema to app URL: ${params.get('appUrl')}`);
}
}

@@ -190,3 +230,3 @@ if (!logout) {

await getAppUrlAndSubdomainFromSuggestions(params);
} else if (!params.has('subdomain')) {
} else if (!params.has('subdomain') && !localhostRegex.test(params.get('appUrl'))) {
Logger.debug('Subdomain not given');

@@ -203,13 +243,14 @@ const appName = await getAppName(params.get('appUrl'));

if (!params.has('subdomain')) {
params.set('subdomain', extractSubdomain(params.get('appUrl')));
if (!params.has('subdomain')) {
const subdomain = extractSubdomain(params.get('appUrl'));
if (!subdomain) {
throw new Error('Subdomain not given');
}
Logger.log(`Subdomain determined from app URL: ${params.get('subdomain')}`);
params.set('subdomain', subdomain);
Logger.log(`Subdomain determined from app URL: ${subdomain}`);
}
}
function renewTokenUrl() {
async function renewTokenUrl() {
const renewUrl = params.get('appUrl') !== loadedAppUrl;
if (!params.get('tokenUrl') || renewUrl) {
params.set('tokenUrl', getTokenBaseUrl(params, renewUrl));
params.set('tokenUrl', await getTokenBaseUrl(params, renewUrl));
}

@@ -221,3 +262,3 @@ }

if (!logout) {
renewTokenUrl();
await renewTokenUrl();
}

@@ -248,5 +289,7 @@ }

async function addPassword() {
params.set('password', await Question.askQuestion('Password: ', undefined, true));
if (params.get('password') === '') {
throw new Error('Password cannot be empty');
if (params.get('isEmptyPassword')) {
params.set('password', '');
} else {
params.set('password', await Question.askQuestion('Password: ', undefined, true));
console.log();
}

@@ -256,2 +299,3 @@ }

params.set('clientsecret', await Question.askQuestion('clientsecret: ', undefined, true));
console.log();
if (params.get('clientsecret') === '') {

@@ -264,3 +308,3 @@ throw new Error('clientsecret cannot be empty');

params.set('passcode', await Question.askQuestion(prompt, undefined, true));
console.log(); // DON'T: process.stdout.write('\n');
console.log();
if (params.get('passcode') === '') {

@@ -303,6 +347,15 @@ throw new Error('Passcode cannot be empty');

if (params.has('username')) {
if (!params.get('password') && !logout) {
if (!params.has('password') && !logout) {
await addPassword();
}
if (params.get('password') === '') {
params.set('isEmptyPassword', true);
// TODO check password only in production env
// throw new Error('Password cannot be empty');
} else {
params.delete('isEmptyPassword');
}
return;
} else {
params.delete('isEmptyPassword');
}

@@ -309,0 +362,0 @@

@@ -34,3 +34,2 @@ const csdl2openapi = require('./csdl2openapi')

const openapi = csdl2openapi.csdl2openapi(csdl, options);
csdl2openapi.deleteUnreferencedSchemas(openapi);
return openapi;

@@ -37,0 +36,0 @@ }

@@ -376,3 +376,4 @@ 'use strict';

if (!requires[service]) {
const model = path.relative(cwd, dest);
const initial_model = path.relative(cwd, dest);
const model = isLinux() ? initial_model.replace(/\\/g, '/') : initial_model;
cds.env.requires[service] = requires[service] = { kind: version, model };

@@ -413,2 +414,9 @@ await write(package_json, JSON.stringify(conf, null, ' '));

function isLinux(){
let platform = process.platform
if (platform === 'linux' || platform === 'win32'){
return true;
}
}
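Note that despite its name, `isLinux()` above returns true for both `'linux'` and `'win32'`, so the backslash normalization runs wherever `path.relative` may emit `\`. A minimal sketch of that normalization (illustrative helper name, not part of the package):

```javascript
// Sketch: make a path.relative result portable for package.json,
// as the diff above does for the imported model path.
const normalizeModelPath = rel => rel.replace(/\\/g, '/');
```

For example, `normalizeModelPath('srv\\external\\api')` produces `srv/external/api`.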
function readEnvVariables(options, cliCheck) {

@@ -415,0 +423,0 @@ if (cds.env.import) {

@@ -1146,3 +1146,2 @@ /**

if (associationSet === undefined) {
console.log(relationshipName + " " + parserContext.serviceNamespace);
throw new Error(messages.MISSING_ASSOCIATION_SETS);

@@ -1149,0 +1148,0 @@ }

@@ -46,2 +46,23 @@ /* eslint-disable no-prototype-builtins */

const known_vocabularies = {
'Org.OData.Authorization.V1': 'Authorization',
'Org.OData.Aggregation.V1': 'Aggregation',
'Org.OData.Core.V1': 'Core',
'Org.OData.Capabilities.V1': 'Capabilities',
'Org.OData.Validation.V1': 'Validation',
'Org.OData.Measures.V1': 'Measures',
'Org.OData.Repeatability.V1': 'Repeatability',
'com.sap.vocabularies.Analytics.v1': 'Analytics',
'com.sap.vocabularies.CodeList.v1': 'CodeList',
'com.sap.vocabularies.Common.v1': 'Common',
'com.sap.vocabularies.Communication.v1': 'Communication',
'com.sap.vocabularies.Graph.v1': 'Graph',
'com.sap.vocabularies.HTML5.v1': 'HTML5',
'com.sap.vocabularies.ODM.v1': 'ODM',
'com.sap.vocabularies.PersonalData.v1': 'PersonalData',
'com.sap.vocabularies.Session.v1': 'Session',
'com.sap.vocabularies.UI.v1': 'UI',
};
function _initialize(parserContext) {

@@ -887,3 +908,3 @@ parserContext.unboundedActions = {};

// Step 7: get all annotations
_generateDocumentation(schemaData, csn.definitions);
_generateDocumentation(edmxAsJson, schemaData, csn.definitions, parserContext);

@@ -897,3 +918,3 @@ return JSON.stringify(csn, null, 4);

function _checkAnnotatationTerm(annotation, path, no_of_terms, targetString) {
function _checkAnnotatationTerm(annotation, path, no_of_terms, targetString, vocab) {
if (path) {

@@ -909,22 +930,46 @@ if (targetString === "@Core.Description" && annotation[1]["@Core.Description"]) {

}
else if (vocab) {
if (vocab.includes(targetString.substring(targetString.indexOf('@') + 1, targetString.lastIndexOf('.')))) {
path[targetString] = annotation[1][targetString];
}
}
}
}
function _checkAnnotationTarget(annotation, cdsDefs, str, no_of_terms) {
if (annotation[0].indexOf('/') === -1 && annotation[0].indexOf('(') === -1) {
if (annotation[0].endsWith(".EntityContainer")) {
let path = annotation[0].split(".")[0];
_checkAnnotatationTerm(annotation, cdsDefs[path], no_of_terms, str);
}
function checkEntitySetMap(pathName, parserContext) {
if (parserContext.entitySetToEntityTypeMap.get(pathName) != null) {
pathName = parserContext.namespace + '.' + parserContext.entitySetToEntityTypeMap.get(pathName)[0];
}
return pathName;
}
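An illustrative run of the lookup above, with an assumed `parserContext`: entity-set names found in the map are rewritten to fully qualified entity-type names, anything else passes through unchanged. The names `my.bookshop` and `Books` are made up for the example.

```javascript
// Sketch mirroring checkEntitySetMap with a hand-built parserContext.
function demoCheckEntitySetMap(pathName, parserContext) {
  if (parserContext.entitySetToEntityTypeMap.get(pathName) != null) {
    return parserContext.namespace + '.' +
      parserContext.entitySetToEntityTypeMap.get(pathName)[0];
  }
  return pathName;
}

const ctx = {
  namespace: 'my.bookshop',
  entitySetToEntityTypeMap: new Map([['Books', ['Books']]])
};
```

So `demoCheckEntitySetMap('Books', ctx)` resolves to `my.bookshop.Books`, while an unmapped name is returned as-is.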
function _checkAnnotationTarget(annotation, cdsDefs, str, no_of_terms, vocab, parserContext) {
if ((annotation[0].indexOf('/') === -1 && annotation[0].indexOf('(') === -1) || annotation[0].includes("Container")) {
if (cdsDefs[annotation[0]]) {
_checkAnnotatationTerm(annotation, cdsDefs[annotation[0]], no_of_terms, str);
_checkAnnotatationTerm(annotation, cdsDefs[annotation[0]], no_of_terms, str, vocab);
}
} else if (annotation[0].indexOf('(') === -1 && !(annotation[0].indexOf('/') === -1)) {
if (annotation[0].indexOf('/') > 0) {
let path = annotation[0].split('/');
let temp = parserContext.namespace + '.' + path[1];
let path1 = checkEntitySetMap(temp, parserContext);
if (cdsDefs[path1]) {
_checkAnnotatationTerm(annotation, cdsDefs[path1], no_of_terms, str, vocab);
}
}
else if (annotation[0].includes("Container")) {
let path = annotation[0].substring(0,annotation[0].lastIndexOf('.'));
let actualPath = checkEntitySetMap(path, parserContext);
_checkAnnotatationTerm(annotation, cdsDefs[actualPath], no_of_terms, str, vocab);
}
}
else if (annotation[0].indexOf('(') === -1 && !(annotation[0].indexOf('/') === -1)) {
let path = annotation[0].split('/');
if (cdsDefs[path[0]]) {
_checkAnnotatationTerm(annotation, cdsDefs[path[0]]["elements"][path[1]], no_of_terms, str);
let path1 = checkEntitySetMap(path[0], parserContext);
if (cdsDefs[path1]) {
_checkAnnotatationTerm(annotation, cdsDefs[path1]["elements"][path[1]], no_of_terms, str, vocab);
}
} else {
let path = annotation[0].split('(');
if (cdsDefs[path[0]] == undefined) {
let actualPath = checkEntitySetMap(path[0], parserContext);
if (cdsDefs[actualPath] === undefined) {
let temp = path[1].split(')');

@@ -934,11 +979,11 @@ let path1 = temp[0].split(',');

if (cdsDefs[path1[0]]) {
_checkAnnotatationTerm(annotation, cdsDefs[path1[0]]["actions"][path2[1]], no_of_terms, str);
_checkAnnotatationTerm(annotation, cdsDefs[path1[0]]["actions"][path2[1]], no_of_terms, str, vocab);
}
} else {
if (path[1].indexOf('/') === -1) {
_checkAnnotatationTerm(annotation, cdsDefs[path[0]], no_of_terms, str);
_checkAnnotatationTerm(annotation, cdsDefs[path[0]], no_of_terms, str, vocab);
} else {
let path1 = path[1].split('/');
if (cdsDefs[path[0]] && cdsDefs[path[0]]["params"]) {
_checkAnnotatationTerm(annotation, cdsDefs[path[0]]["params"][path1[1]], no_of_terms, str);
_checkAnnotatationTerm(annotation, cdsDefs[path[0]]["params"][path1[1]], no_of_terms, str, vocab);
}

@@ -950,17 +995,47 @@ }

function _generateDocumentation(schemaData, cdsDefs) {
function _replaceSchemaDataAnnotation(vocabAlias, vocabNamespce, schemaDataAnnotation) {
for (let v in vocabAlias) {
if (schemaDataAnnotation.includes(vocabAlias[v])) {
schemaDataAnnotation = schemaDataAnnotation.split(vocabAlias[v]).join(known_vocabularies[vocabNamespce[v]]);
}
if (schemaDataAnnotation.includes(vocabNamespce[v])) {
schemaDataAnnotation = schemaDataAnnotation.split(vocabNamespce[v]).join(known_vocabularies[vocabNamespce[v]]);
}
}
return JSON.parse(schemaDataAnnotation);
}
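What `_replaceSchemaDataAnnotation` does can be shown on a tiny input: both a vocabulary alias (e.g. `SAP__common`, an assumed alias here) and the full namespace are rewritten to the short name registered in `known_vocabularies`, operating on the stringified `$Annotations`.

```javascript
// Sketch of the alias/namespace normalization above, with a one-entry
// vocabulary map and an assumed alias.
const knownVocabularies = { 'com.sap.vocabularies.Common.v1': 'Common' };

function replaceAnnotation(aliases, namespaces, json) {
  for (let v in aliases) {
    if (json.includes(aliases[v])) {
      json = json.split(aliases[v]).join(knownVocabularies[namespaces[v]]);
    }
    if (json.includes(namespaces[v])) {
      json = json.split(namespaces[v]).join(knownVocabularies[namespaces[v]]);
    }
  }
  return JSON.parse(json);
}
```

For example, `{"@SAP__common.Label": "Title"}` comes back as `{"@Common.Label": "Title"}`.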
function _generateDocumentation(edmxAsJson, schemaData, cdsDefs, parserContext) {
let vocabAlias = [];
let vocab = [];
let vocabNamespce = [];
if (schemaData[0].$Annotations) {
if (edmxAsJson.$Reference) {
Object.entries(edmxAsJson.$Reference).forEach(a => {
if (known_vocabularies.hasOwnProperty(a[1].$Include[0].$Namespace)) {
vocabAlias.push(a[1].$Include[0].$Alias);
vocabNamespce.push(a[1].$Include[0].$Namespace);
vocab.push(known_vocabularies[a[1].$Include[0].$Namespace])
}
});
}
schemaData[0].$Annotations = _replaceSchemaDataAnnotation(vocabAlias, vocabNamespce, JSON.stringify(schemaData[0].$Annotations));
Object.entries(schemaData[0].$Annotations).map(annotation => {
let no_of_terms;
// eslint-disable-next-line no-prototype-builtins
if (annotation[1].hasOwnProperty('@Core.Description')) {
no_of_terms = true;
_checkAnnotationTarget(annotation, cdsDefs, "@Core.Description", no_of_terms);
}
// eslint-disable-next-line no-prototype-builtins
if (annotation[1].hasOwnProperty('@Core.LongDescription')) {
_checkAnnotationTarget(annotation, cdsDefs, "@Core.LongDescription", no_of_terms);
}
Object.entries(annotation[1]).forEach(anno => {
if (vocab.includes(anno[0].substring(anno[0].indexOf('@') + 1, anno[0].lastIndexOf('.')))) {
if (annotation[1].hasOwnProperty('@Core.Description')) {
no_of_terms = true;
}
if (annotation[1].hasOwnProperty('@Common.FieldControl')) {
annotation[1]["@Common.FieldControl"] = { "#": annotation[1]["@Common.FieldControl"] }
}
if (annotation[1].hasOwnProperty('@Common.IsUpperCase')) {
annotation[1]["@Common.IsUpperCase"] = true;
}
_checkAnnotationTarget(annotation, cdsDefs, anno[0], no_of_terms, vocab, parserContext);
}
});
});

@@ -978,3 +1053,2 @@ }

function _generateEDMX2JSON(edmx, name_space, parserContext) {

@@ -981,0 +1055,0 @@ return new Promise(function getJson(resolve, reject) {

@@ -139,3 +139,3 @@ const {

cdsParam["@openapi.explode"] = true;
if (param.explode === false &&

@@ -373,7 +373,7 @@ (param.style === "form" || (!param.style && param.in === "query"))

type: "common.JSON",
"@openapi.schema": jsonSchema,
"@JSON.Schema": jsonSchema,
});
} else {
type.type = "common.JSON";
if (jsonSchema !== "{}") type["@openapi.schema"] = jsonSchema;
if (jsonSchema !== "{}") type["@JSON.Schema"] = jsonSchema;
}

@@ -380,0 +380,0 @@ }

const os = require('os');
const MAVEN_ARCHETYPE_VERSION = '1.26.1';
const MAVEN_ARCHETYPE_VERSION = '1.27.0';

@@ -19,2 +19,5 @@ const constants = {

OPTION_MTX: 'mtx',
OPTION_MULTITENANCY: 'multitenancy',
OPTION_TOGGLES: 'toggles',
OPTION_EXTENSIBILITY: 'extensibility',
OPTION_AUDITLOG: 'auditlog',

@@ -72,5 +75,8 @@ OPTION_APPROUTER: 'approuter',

constants.OPTION_CF_MANIFEST,
constants.OPTION_MTX,
constants.OPTION_MULTITENANCY,
constants.OPTION_TOGGLES,
constants.OPTION_EXTENSIBILITY,
constants.OPTION_XSUAA,
constants.OPTION_HANA,
constants.OPTION_XSUAA,
constants.OPTION_MTX,
constants.OPTION_AUDITLOG,

@@ -77,0 +83,0 @@ constants.OPTION_APPROUTER,

@@ -1,2 +0,1 @@

const fs = require('fs')
const { join } = require('path')

@@ -12,3 +11,3 @@ const TemplateBase = require('../templateBase')

approuterExtensibility, approuterExtensibilityJava // xs-app.json config
} = require('../_merging/existences')
} = require('../_merging/registry-mta')

@@ -28,5 +27,5 @@ module.exports = class ApprouterTemplate extends TemplateBase {

const projectDescriptor = await this.projectReader.read(this.options)
const mtaYAMLPath = join(this.projectPath, 'mta.yaml')
const { isNodejs, isJava, isMultitenant } = projectDescriptor.cap
if (fs.existsSync(mtaYAMLPath)) {
const { isNodejs, isJava, isExtensible, isMultitenant, hasMtaDeployment } = projectDescriptor.cap
if (hasMtaDeployment) {
const srv = isNodejs ? srvNode : srvJava

@@ -36,29 +35,19 @@ const apis = isJava ? [requiredJavaApprouterAPI, providedJavaApprouterAPI] : []

if (isJava && isMultitenant) apis.push(requiredMtxAPI)
const existences = [srv, approuter, xsuaa, ...apis]
await this.templateUtil.mergeYAML(
mtaYAMLPath,
`${__dirname}/files/mta.yaml.hbs`,
projectDescriptor,
{
existences,
relationships: [{
in: srv,
inKeyPath: ["requires"],
into: "name",
existences: [xsuaa],
existenceKeyPath: ["name"],
}, {
in: approuter,
inKeyPath: ["requires"],
into: "name",
existences: [xsuaa],
existenceKeyPath: ["name"],
}],
}
)
const additions = [srv, approuter, xsuaa, ...apis]
const mtaYAMLPath = join(this.projectPath, 'mta.yaml')
await this.templateUtil.mergeYAML(mtaYAMLPath, `${__dirname}/files/mta.yaml.hbs`, projectDescriptor, {
additions,
relationships: [{
insert: [xsuaa, 'name'],
into: [srv, 'requires', 'name']
}, {
insert: [xsuaa, 'name'],
into: [approuter, 'requires', 'name']
}],
})
}
const appPath = join(this.projectPath, projectDescriptor.ui.appPath || 'app')
const existences = isMultitenant && isJava ? [approuterExtensibilityJava] :
isMultitenant && isNodejs ? [approuterExtensibility] : []
const additions = isExtensible && isJava ? [approuterExtensibilityJava] :
isExtensible && isNodejs ? [approuterExtensibility] : []
await this.templateUtil.mergeJSON(

@@ -68,3 +57,3 @@ `${appPath}/xs-app.json`,

projectDescriptor,
{ existences }
{ additions }
)

@@ -75,2 +64,3 @@ }

const projectDescriptor = await this.projectReader.read(this.options)
const { for: forProfile } = projectDescriptor.cap
const appPath = join(this.projectPath, projectDescriptor.ui.appPath || 'app')

@@ -91,3 +81,3 @@ const appPackageJSONPath = join(appPath, 'package.json')

const cdsrcJSONPath = join(this.projectPath, '.cdsrc.json')
const cdsTemplateFile = projectDescriptor.cap.for ? 'cds.cdsrc.json.hbs' : 'cds.cdsrc.json'
const cdsTemplateFile = forProfile ? 'cds.cdsrc.json.hbs' : 'cds.cdsrc.json'
const cdsPackageJSONPath = join(__dirname, 'files', cdsTemplateFile)

@@ -99,3 +89,3 @@ await this.templateUtil.mergeJSON(cdsrcJSONPath, cdsPackageJSONPath, projectDescriptor)

const packageJSONPath = join(this.projectPath, 'package.json')
const cdsTemplateFile = projectDescriptor.cap.for ? 'cds.package.json.hbs' : 'cds.package.json'
const cdsTemplateFile = forProfile ? 'cds.package.json.hbs' : 'cds.package.json'
const cdsPackageJSONPath = join(__dirname, 'files', cdsTemplateFile)

@@ -102,0 +92,0 @@ await this.templateUtil.mergeJSON(packageJSONPath, cdsPackageJSONPath, projectDescriptor)

@@ -5,3 +5,3 @@ const fs = require('fs');

const ProjectReader = require('../../util/projectReader');
const { srvNode, srvJava, auditlog } = require('../_merging/existences');
const { srvNode, srvJava, auditlog } = require('../_merging/registry-mta');

@@ -16,5 +16,6 @@ module.exports = class AuditlogTemplate extends TemplateBase {

const projectDescriptor = await this.projectReader.read(this.options);
const { isNodejs } = projectDescriptor.cap;
const mtaYAMLPath = path.join(this.projectPath, 'mta.yaml');
if (!fs.existsSync(mtaYAMLPath)) return;
const srv = projectDescriptor.cap.isNodejs ? srvNode : srvJava
const srv = isNodejs ? srvNode : srvJava
await this.templateUtil.mergeYAML(

@@ -25,10 +26,7 @@ mtaYAMLPath,

{
existences: [srv, auditlog],
additions: [srv, auditlog],
relationships: [{
in: srv,
inKeyPath: ["requires"],
into: "name",
existences: [auditlog],
existenceKeyPath: ["name"],
}],
insert: [auditlog, 'name'],
into: [srv, 'requires', 'name']
}]
}

@@ -35,0 +33,0 @@ );

@@ -14,3 +14,3 @@ const fs = require('fs');

const { nullLogger } = require('../../util/logger');
const { srvNode, srvJava, hdbDeployer, serviceManager, hdiContainer } = require('../_merging/existences');
const { srvNode, srvJava, hdbDeployer, serviceManager, hdiContainer, mtxSidecar } = require('../_merging/registry-mta');

@@ -47,53 +47,58 @@ module.exports = class HanaTemplate extends TemplateBase {

async runDependentMerging() {
const projectDescriptor = await this.projectReader.read(this.options);
const projectDescriptor = await this.projectReader.read(this.options)
const { hasMtaDeployment, isNodejs, isJava, isMultitenant, hasHana } = projectDescriptor.cap
const mtaYAMLPath = path.join(this.projectPath, 'mta.yaml');
if (isNodejs && isMultitenant) {
await this._updateCdsConfiguration()
} else if (isJava && isMultitenant) {
await this._updateCdsrcJson()
}
if (!fs.existsSync(mtaYAMLPath)) return
if (hasMtaDeployment) {
const srv = isNodejs ? srvNode : srvJava
const db = isMultitenant ? serviceManager : hdiContainer
const deletions = isMultitenant ? [{
item: hdbDeployer,
relationships: [{
removeProperty: 'name',
allWithin: ['modules', 'requires', 'name'],
}]
}, {
item: hdiContainer,
relationships: [{
removeProperty: 'name',
allWithin: ['modules', 'requires', 'name'],
}]
}] : []
const srv = projectDescriptor.cap.isNodejs ? srvNode : srvJava
const deployer = projectDescriptor.cap.isMultitenant ? null : hdbDeployer
const db = projectDescriptor.cap.isMultitenant ? serviceManager : hdiContainer
const deletions = projectDescriptor.cap.isMultitenant ? [{
existence: hdbDeployer,
relationships: [{
allWithinKeyPath: ["modules"],
inKeyPath: ["requires"],
into: "name",
existenceKeyPath: ["name"],
const mtaYAMLPath = path.join(this.projectPath, 'mta.yaml')
const relationships = [{
insert: [db, 'name'],
into: [srv, 'requires', 'name']
}]
}, {
existence: hdiContainer,
relationships: [{
allWithinKeyPath: ["modules"],
inKeyPath: ["requires"],
into: "name",
existenceKeyPath: ["name"],
}]
}] : []
await this.templateUtil.mergeYAML(
mtaYAMLPath,
`${__dirname}/files/mta.yaml.hbs`,
projectDescriptor,
{
existences: [srv, deployer, db],
relationships: [{
in: srv,
inKeyPath: ["requires"],
into: "name",
existences: [db],
existenceKeyPath: ["name"],
},
!projectDescriptor.cap.isMultitenant && projectDescriptor.cap.isNodejs ?
{
in: deployer,
inKeyPath: ["requires"],
into: "name",
existences: [db],
existenceKeyPath: ["name"],
} : null],
deletions
if (!isMultitenant) {
relationships.push({
insert: [hdiContainer, 'name'],
into: [hdbDeployer, 'requires', 'name']
})
}
);
if (isJava && isMultitenant) {
relationships.push({
insert: [serviceManager, 'name'],
into: [mtxSidecar, 'requires', 'name']
})
}
const modules = !isMultitenant ? [srv, hdbDeployer] : [srv]
await this.templateUtil.mergeYAML(
mtaYAMLPath,
`${__dirname}/files/mta.yaml.hbs`,
projectDescriptor,
{ additions: [...modules, db].filter(a => a), deletions, relationships }
)
}
}

@@ -117,10 +122,9 @@

await this._updatePomXml();
const cdsrcJsonPath = path.join(this.projectPath, '.cdsrc.json');
await this._updateCdsrcJson(cdsrcJsonPath, env.folders.db, env);
await this._updateCdsrcJson();
break;
}
case PROJECT_TYPE.nodejs:
await this._updatePackageJson(path.join(this.projectPath, 'package.json'));
case PROJECT_TYPE.nodejs: {
await this._updatePackageJson()
break;
}
default: break;

@@ -148,11 +152,7 @@ }

async _updatePackageJson(packageJsonPath) {
const projectDescriptor = await this.projectReader.read(this.options);
async _updatePackageJson() {
const packageJsonPath = path.join(this.projectPath, 'package.json')
const dependenciesTemplatePath = path.join(__dirname, 'files', 'nodejs', 'dependencies.package.json');
await this.templateUtil.mergeJSON(packageJsonPath, dependenciesTemplatePath);
const cdsTemplateFile = projectDescriptor.cap.for ? 'cds.profile.package.json.hbs' : 'cds.package.json.hbs';
const cdsTemplatePath = path.join(__dirname, 'files', 'nodejs', cdsTemplateFile);
await this.templateUtil.mergeJSON(packageJsonPath, cdsTemplatePath, projectDescriptor);
const packageJson = await this.fsUtil.readJSON(packageJsonPath);

@@ -163,7 +163,20 @@ if (packageJson.dependencies['@sap/hana-client']) {

}
await this._updateCdsConfiguration(packageJsonPath)
}
async _updateCdsrcJson(cdsrcJsonPath) {
async _updateCdsConfiguration() {
const packageJsonPath = path.join(this.projectPath, 'package.json')
const projectDescriptor = await this.projectReader.read(this.options);
const cdsTemplateFile = projectDescriptor.cap.for ? 'profile.cdsrc.json.hbs' : 'cdsrc.json.hbs';
const { for: forProfile } = projectDescriptor.cap;
const cdsTemplateFile = forProfile ? 'cds.profile.package.json.hbs' : 'cds.package.json.hbs';
const cdsTemplatePath = path.join(__dirname, 'files', 'nodejs', cdsTemplateFile);
await this.templateUtil.mergeJSON(packageJsonPath, cdsTemplatePath, projectDescriptor, { forceOverwrite: true })
}
async _updateCdsrcJson() {
const cdsrcJsonPath = path.join(this.projectPath, '.cdsrc.json');
const projectDescriptor = await this.projectReader.read(this.options);
const { for: forProfile } = projectDescriptor.cap;
const cdsTemplateFile = forProfile ? 'profile.cdsrc.json.hbs' : 'cdsrc.json.hbs';
const cdsTemplatePath = path.join(__dirname, 'files', 'java', cdsTemplateFile);

@@ -173,4 +186,4 @@ if (!fs.existsSync(cdsrcJsonPath)) {

}
await this.templateUtil.mergeJSON(cdsrcJsonPath, cdsTemplatePath, projectDescriptor);
await this.templateUtil.mergeJSON(cdsrcJsonPath, cdsTemplatePath, projectDescriptor, { forceOverwrite: true });
}
}
const cds = require('../../../cds')
const fs = require('fs');
const path = require('path');
const path = require('path')
const ProjectReader = require('../../util/projectReader');
const ProjectReader = require('../../util/projectReader')
const TemplateBase = require('../templateBase')
const { PROJECT_TYPE, } = require('../../constants')
const { srvNode, srvJava, kibanaLogging } = require('../_merging/registry-mta')
const TemplateBase = require('../templateBase');
const { PROJECT_TYPE, } = require('../../constants');
const { srvNode, srvJava, kibanaLogging } = require('../_merging/existences');
module.exports = class KibanaTemplate extends TemplateBase {
constructor(projectPath, generator) {
super(projectPath, generator, __dirname);
this.projectReader = new ProjectReader(this);
super(projectPath, generator, __dirname)
this.projectReader = new ProjectReader(this)
}
async runDependentMerging() {
const mtaYAMLPath = path.join(this.projectPath, 'mta.yaml');
const projectDescriptor = await this.projectReader.read(this.options)
const { hasMtaDeployment, isNodejs } = projectDescriptor.cap
if (!fs.existsSync(mtaYAMLPath)) {
return;
if (hasMtaDeployment) {
const srv = isNodejs ? srvNode : srvJava
const mtaYAMLPath = path.join(this.projectPath, 'mta.yaml')
await this.templateUtil.mergeYAML(
mtaYAMLPath,
`${__dirname}/files/mta.yaml.hbs`,
projectDescriptor,
{
additions: [srv, kibanaLogging],
relationships: [{
insert: [kibanaLogging, 'name'],
into: [srv, 'requires', 'name']
}],
}
)
}
const projectDescriptor = await this.projectReader.read(this.options);
const srv = projectDescriptor.cap.isNodejs ? srvNode : srvJava
await this.templateUtil.mergeYAML(
mtaYAMLPath,
`${__dirname}/files/mta.yaml.hbs`,
projectDescriptor,
{
existences: [srv, kibanaLogging],
relationships: [{
in: srv,
inKeyPath: ["requires"],
into: "name",
existences: [kibanaLogging],
existenceKeyPath: ["name"],
}],
}
);
}
async run() {
const projectType = await this.getProjectType();
const projectType = await this.getProjectType()

@@ -51,8 +44,9 @@ switch (projectType) {

default:
cds.error(`kibana-logging is not implemented for project type '${projectType}' yet`);
cds.error(`kibana-logging is not implemented for project type '${projectType}' yet`)
// TODO: implement
break;
break
case PROJECT_TYPE.nodejs: {
const projectDescriptor = await this.projectReader.read(this.options);
const cdsTemplateFile = projectDescriptor.cap.for ? 'cds.profile.package.json.hbs' : 'cds.package.json'
const { for: forProfile } = projectDescriptor.cap
const cdsTemplateFile = forProfile ? 'cds.profile.package.json.hbs' : 'cds.package.json'
const cdsTemplatePath = path.join(__dirname, 'files', cdsTemplateFile)

@@ -59,0 +53,0 @@ await this.templateUtil.mergeJSON(path.join(this.projectPath, 'package.json'), cdsTemplatePath, projectDescriptor)

@@ -1,2 +0,2 @@

module.exports = require("@sap/eslint-plugin-cds/lib/api").createRule({
module.exports = require("@sap/eslint-plugin-cds").createRule({
meta: {

@@ -14,16 +14,17 @@ docs: {

|| ||`,
version: "1.0.0",
version: "1.0.0"
},
fixable: "code",
fixable: "code"
},
create: function (context) {
const m = context.cds.model;
m.forall("entity", (d) => {
const entityName = d.name.split(".").pop();
return { entity: checkForMoo };
function checkForMoo(e) {
const entityName = e.name.split(".").pop();
if (entityName === "Moo") {
const loc = context.cds.getLocation(entityName, d);
const loc = context.getLocation(entityName, e);
const fix = (fixer) => {
const rangeEnd = context.sourcecode.getIndexFromLoc({
const rangeEnd = context.getSourceCode().getIndexFromLoc({
line: loc.end.line,
column: loc.end.column,
column: loc.end.column
});

@@ -34,9 +35,8 @@ return fixer.insertTextAfterRange([0, rangeEnd], "n");

message: "Entity 'Moo' not allowed!",
loc,
fix,
file: d.$location.file,
node: context.getNode(e),
fix
});
}
});
},
}
}
})

@@ -14,2 +14,3 @@ const { existsSync } = require('fs');

const projectDescriptor = await this.projectReader.read();
const { hasApprouter, hasHana, hasXsuaa, isMultitenant, isExtensible } = projectDescriptor.cap;
if (!existsSync(join(this.projectPath, 'mta.yaml'))) {

@@ -22,3 +23,3 @@ await this.templateUtil.copyFiles('.', this.projectPath, projectDescriptor);

// project reader, so that there's a logical mapping from hasX to XTemplate.
if (projectDescriptor.cap.hasApprouter) {
if (hasApprouter) {
const ApprouterTemplate = require(`../approuter`)

@@ -28,3 +29,3 @@ const template = new ApprouterTemplate(this.projectPath, this.generator)

}
if (projectDescriptor.cap.hasHana) {
if (hasHana) {
const HanaTemplate = require(`../hana`)

@@ -34,3 +35,3 @@ const template = new HanaTemplate(this.projectPath, this.generator)

}
if (projectDescriptor.cap.hasXsuaa) {
if (hasXsuaa) {
const XsuaaTemplate = require(`../xsuaa`)

@@ -40,8 +41,13 @@ const template = new XsuaaTemplate(this.projectPath, this.generator)

}
if (projectDescriptor.cap.isMultitenant) {
const MtxTemplate = require(`../mtx`)
const template = new MtxTemplate(this.projectPath, this.generator)
if (isMultitenant) {
const MultitenancyTemplate = require(`../multitenancy`)
const template = new MultitenancyTemplate(this.projectPath, this.generator)
await template.runDependentMerging()
}
if (isExtensible) {
const ExtensibilityTemplate = require(`../extensibility`)
const template = new ExtensibilityTemplate(this.projectPath, this.generator)
await template.runDependentMerging()
}
}
};
}

@@ -1,13 +0,5 @@

const cds = require('../../../cds'), { fs, path } = cds.utils
const TemplateBase = require('../templateBase')
const ProjectReader = require('../../util/projectReader')
const { OPTION_HANA, OPTION_XSUAA } = require('../../constants')
const { OPTION_MULTITENANCY, OPTION_TOGGLES, OPTION_EXTENSIBILITY } = require('../../constants')
const {
srvNode, srvJava, // Server
mtxSidecar, // Additional Modules
saasRegistry, xsuaa, // BTP Services
providedMtxAPI, srvAPI, providedMtxSidecarAPI, requiredMtxSidecarAPI // APIs
} = require('../_merging/existences');
module.exports = class MtxTemplate extends TemplateBase {

@@ -20,136 +12,4 @@ constructor(projectPath, generator) {

async getDependencies() {
return [OPTION_HANA, OPTION_XSUAA]
return [OPTION_MULTITENANCY, OPTION_TOGGLES, OPTION_EXTENSIBILITY]
}
async runDependentMerging() {
const projectDescriptor = await this.projectReader.read()
const mtaYAMLPath = path.join(this.projectPath, 'mta.yaml')
if (fs.existsSync(mtaYAMLPath)) {
const mergingRules = projectDescriptor.cap.isNodejs ?
{
existences: [srvNode, saasRegistry, providedMtxAPI, xsuaa],
overwrites: [{
in: xsuaa,
keyPath: ["parameters", "config", "tenant-mode"],
replacement: "shared"
}],
relationships: [{
in: srvNode,
inKeyPath: ["requires"],
into: "name",
existences: [saasRegistry],
existenceKeyPath: ["name"],
}],
}
:
{
existences: [srvJava, srvAPI, mtxSidecar, xsuaa, saasRegistry, providedMtxAPI, providedMtxSidecarAPI, requiredMtxSidecarAPI],
overwrites: [{
in: xsuaa,
keyPath: ["parameters", "config", "tenant-mode"],
replacement: "shared"
}],
relationships: [{
in: srvJava,
inKeyPath: ["requires"],
into: "name",
existences: [saasRegistry],
existenceKeyPath: ["name"],
}],
}
await this.templateUtil.mergeYAML(
mtaYAMLPath,
path.join(__dirname, `files`, `mta.yaml.hbs`),
projectDescriptor,
mergingRules
)
}
if (projectDescriptor.cap.hasApprouter) {
const ApprouterTemplate = require(`../approuter`)
const template = new ApprouterTemplate(this.projectPath, this.generator)
await template.runDependentMerging()
}
}
async runXSSecurityMerging() {
const projectDescriptor = await this.projectReader.read()
await this.templateUtil.mergeJSON(
path.join(this.projectPath, 'xs-security.json'),
path.join(__dirname, 'files', 'xs-security.json.hbs'),
projectDescriptor,
{
existences: [{
ref: "scope-subscription",
keyPath: ["scopes"],
constraints: [{
comparisonKeyPath: ["name"],
value: "$XSAPPNAME.mtcallback"
}],
}, {
ref: "scope-update",
keyPath: ["scopes"],
constraints: [{
comparisonKeyPath: ["name"],
value: "$XSAPPNAME.mtdeployment"
}],
}, {
ref: "scope-diagnose",
keyPath: ["scopes"],
constraints: [{
comparisonKeyPath: ["name"],
value: "$XSAPPNAME.MtxDiagnose"
}],
}, {
ref: "scope-extend",
keyPath: ["scopes"],
constraints: [{
comparisonKeyPath: ["name"],
value: "$XSAPPNAME.ExtendCDS"
}],
}, {
ref: "scope-extend-delete",
keyPath: ["scopes"],
constraints: [{
comparisonKeyPath: ["name"],
value: "$XSAPPNAME.ExtendCDSdelete"
}],
}, {
ref: "template-extension-developer",
keyPath: ["role-templates"],
constraints: [{
comparisonKeyPath: ["name"],
value: "ExtensionDeveloper"
}],
}, {
ref: "template-extension-developer-undeploy",
keyPath: ["role-templates"],
constraints: [{
comparisonKeyPath: ["name"],
value: "ExtensionDeveloperUndeploy"
}],
}],
}
)
}
async run() {
const projectDescriptor = await this.projectReader.read(this.options)
if (projectDescriptor.cap.isNodejs) {
const packageJSONPath = path.join(this.projectPath, 'package.json')
const cdsTemplateFile = projectDescriptor.cap.for ? 'cds.package.json.hbs' : 'cds.package.json'
await this.templateUtil.mergeJSON(packageJSONPath, path.join(__dirname, 'files', cdsTemplateFile), projectDescriptor)
await this.templateUtil.mergeJSON(packageJSONPath, path.join(__dirname, 'files', 'dependencies.package.json'))
await this.templateUtil.sortDependencies(packageJSONPath)
} else {
const cdsrcJSONPath = path.join(this.projectPath, '.cdsrc.json')
await this.templateUtil.mergeJSON(cdsrcJSONPath, path.join(__dirname, 'files', 'java', '.cdsrc.json'))
await fs.copy(path.join(__dirname, 'files', 'java'), path.join(this.projectPath, 'mtx-sidecar'))
}
await this.runXSSecurityMerging()
await this.runDependentMerging()
}
}

@@ -67,3 +67,3 @@ const cds = require('@sap/cds');

* @abstract
* Executes the given template's merging steps.
* Executes the given template's merging steps that depend on another template.
* The separation from `run()` is necessary to preserve templating associativity,

@@ -75,2 +75,3 @@ * i.e. the order in which you apply them does not matter.

* Separating the two lets us run _only_ the merging part of the template.
* In addition, this partial merging avoids cyclic template dependencies.
*/

@@ -77,0 +78,0 @@ async runDependentMerging() {

@@ -1,7 +0,6 @@

const fs = require('fs')
const path = require('path')
const TemplateBase = require('../templateBase')
const ProjectReader = require('../../util/projectReader')
const { srvNode, srvJava, xsuaa } = require('../_merging/existences')
const { PROJECT_TYPE, OPTION_HELM_UPDATE } = require('../../constants');
const { srvNode, srvJava, xsuaa, mtxSidecar } = require('../_merging/registry-mta')
const { PROJECT_TYPE, OPTION_HELM_UPDATE } = require('../../constants')

@@ -16,32 +15,43 @@ module.exports = class XsuaaTemplate extends TemplateBase {

const projectDescriptor = await this.projectReader.read(this.options)
const { projectPath } = this
const { isMultitenant, isExtensible, hasMtaDeployment, isNodejs, isJava } = projectDescriptor.cap
const mtaYAMLPath = path.join(projectPath, 'mta.yaml')
if (!fs.existsSync(mtaYAMLPath)) return
if (hasMtaDeployment) {
const srv = isNodejs ? srvNode : srvJava
const additions = [srv, xsuaa]
if (isMultitenant && isJava) additions.push(mtxSidecar)
const srv = projectDescriptor.cap.isNodejs ? srvNode : srvJava
await this.templateUtil.mergeYAML(
mtaYAMLPath,
`${__dirname}/files/mta.yaml.hbs`,
projectDescriptor,
{
existences: [srv, xsuaa],
relationships: [{
in: srv,
inKeyPath: ["requires"],
into: "name",
existences: [xsuaa],
existenceKeyPath: ["name"],
}],
}
)
const mtaYAMLPath = path.join(this.projectPath, 'mta.yaml')
const relationships = [{
insert: [xsuaa, 'name'],
into: [srv, 'requires', 'name']
}]
relationships.push({
insert: [xsuaa, 'name'],
into: [mtxSidecar, 'requires', 'name']
})
await this.templateUtil.mergeYAML(
mtaYAMLPath,
`${__dirname}/files/mta.yaml.hbs`,
projectDescriptor,
{
additions,
relationships
}
)
}
// Re-applying the merging part of `cds add mtx` because cds.compile.to.xsuaa
// has overwritten the existing `xs-security.json`.
// TODO: Find a generic solution for situations like these.
if (projectDescriptor.cap.isMultitenant) {
const MtxTemplate = require(`../mtx`)
const mtxTemplate = new MtxTemplate(this.projectPath, this.generator)
await mtxTemplate.runDependentMerging()
if (isMultitenant) {
const MultitenancyTemplate = require(`../multitenancy`)
const multitenancyTemplate = new MultitenancyTemplate(this.projectPath, this.generator)
await multitenancyTemplate.runDependentMerging()
}
if (isExtensible) {
const ExtensibilityTemplate = require(`../extensibility`)
const extensibilityTemplate = new ExtensibilityTemplate(this.projectPath, this.generator)
await extensibilityTemplate.runDependentMerging()
}
}

@@ -53,2 +63,3 @@

const projectDescriptor = await this.projectReader.read(this.options)
const { for: forProfile } = projectDescriptor.cap

@@ -59,3 +70,3 @@ const projectType = await this.getProjectType();

const cdsrcJSONPath = path.join(projectPath, '.cdsrc.json')
const cdsTemplateFile = projectDescriptor.cap.for ? 'cds.cdsrc.json.hbs' : 'cds.cdsrc.json'
const cdsTemplateFile = forProfile ? 'cds.cdsrc.json.hbs' : 'cds.cdsrc.json'
const cdsTemplatePath = path.join(templatePath, cdsTemplateFile)

@@ -67,3 +78,3 @@ await this.templateUtil.mergeJSON(cdsrcJSONPath, cdsTemplatePath, projectDescriptor)

const packageJSONPath = path.join(projectPath, 'package.json')
const cdsTemplateFile = projectDescriptor.cap.for ? 'cds.package.json.hbs' : 'cds.package.json'
const cdsTemplateFile = forProfile ? 'cds.package.json.hbs' : 'cds.package.json'
const cdsTemplatePath = path.join(templatePath, cdsTemplateFile)

@@ -83,8 +94,8 @@ await this.templateUtil.mergeJSON(packageJSONPath, cdsTemplatePath, projectDescriptor)

const mergingSemantics = { existences: xsuaa.scopes.map(scope => ({
const mergingSemantics = { additions: xsuaa.scopes.map(scope => ({
ref: scope.name,
keyPath: ["scopes"],
constraints: [{
comparisonKeyPath: ["name"],
value: scope.name
in: 'scopes',
where: [{
property: 'name',
isEqualTo: scope.name
}],

@@ -91,0 +102,0 @@ }))}

@@ -66,2 +66,3 @@ const path = require('path')

async _getCapDescriptor(env, options) {
const projectPath = this.projectPath
let capDescriptor = {

@@ -74,8 +75,13 @@ for: options && options.for,

get isMultitenant() {
return options && options.add && options.add.has('mtx') ||
return options?.add?.has('mtx') || options?.add?.has('multitenancy') ||
env.requires.multitenancy ||
env.requires.db && env.requires.db.multiTenant ||
this.isJava && fs.existsSync('mtx-sidecar') ||
env.requires.db?.multiTenant ||
this.isJava && fs.existsSync(path.join('mtx', 'sidecar')) ||
false
},
get isExtensible() {
const sidecarPackage = path.join('mtx', 'sidecar', 'package.json')
return options?.add?.has('mtx') || options?.add?.has('extensibility') || env.requires.extensibility ||
fs.existsSync(sidecarPackage) && JSON.parse(fs.readFileSync(sidecarPackage, 'utf8')).cds.requires.extensibility === true
},
get isNodejs() {

@@ -104,2 +110,5 @@ return this.pLanguage === P_LANGUAGE_NODEJS

return !!env.requires.approuter
},
get hasMtaDeployment() {
return fs.existsSync(path.join(projectPath, 'mta.yaml'))
}

@@ -398,3 +407,2 @@ }

if (resource.isMultitenant) {
//resource.name += '-mt'
resource.service = 'service-manager'

@@ -409,5 +417,2 @@ resource.vcap.plan = 'container'

case 'xsuaa':
// if (capDescriptor.isMultitenant) {
// resource.name += '-mt'
// }
resource.service = reqEntry.kind

@@ -414,0 +419,0 @@ resource.vcap.plan = 'application'

@@ -71,20 +71,18 @@ const path = require('path');

_deepMerge(target, source) {
const unique = array => [...new Set(array.map(JSON.stringify))].map(JSON.parse)
if (this._isObject(target) && this._isObject(source)) {
for (const key in source) {
if (this._isObject(source[key])) {
if (!target[key]) Object.assign(target, { [key]: source[key] });
if ('#overwrite' in source[key]) Object.assign(target, { [key]: source[key]["#overwrite"] })
else this._deepMerge(target[key], source[key]);
if (!target[key]) Object.assign(target, { [key]: source[key] })
else this._deepMerge(target[key], source[key])
} else if (Array.isArray(source[key]) && Array.isArray(target[key])) {
const unique = array => [...new Set(array.map(JSON.stringify))].map(JSON.parse);
target[key] = unique([...source[key], ...target[key]])
} else {
Object.assign(target, { [key]: target[key] || source[key] });
Object.assign(target, { [key]: target[key] || source[key] })
}
}
} else if (Array.isArray(target) && Array.isArray(source)) {
const unique = array => [...new Set(array.map(JSON.stringify))].map(JSON.parse);
target = unique([...source, ...target])
}
return target || source;
return target || source
}
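The `_deepMerge` diff above hoists a small array-deduplication helper to the top of the function. A standalone sketch of that helper's behavior (the sample data is hypothetical; the real code merges arrays from a template file into the project's JSON):

```javascript
// Sketch of the dedup used by _deepMerge: concatenated arrays are
// de-duplicated by JSON identity (structural equality, key order sensitive).
const unique = array => [...new Set(array.map(JSON.stringify))].map(JSON.parse);

const merged = unique([{ name: 'db' }, { name: 'srv' }, { name: 'db' }]);
console.log(merged); // → [{ name: 'db' }, { name: 'srv' }]
```

Note that JSON-string identity means two objects with the same keys in a different order are treated as distinct, which is acceptable here because template and target entries are generated with a consistent shape.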

@@ -96,3 +94,3 @@

if (! await this.fsUtil.pathExists(targetPath)) {
await this.fsUtil.writeJSON(targetPath, source);
await this.fsUtil.writeJSON(targetPath, source)
} else if (semantics) {

@@ -109,7 +107,7 @@ const target = await this.fsUtil.readJSON(targetPath)

} else {
const target = await this.fsUtil.readJSON(targetPath);
const sourceFile = await this._processTemplateFile(from, projectDescriptor);
const source = JSON.parse(sourceFile);
const result = target && source ? this._deepMerge(target, source) : target || source;
await this.fsUtil.writeJSON(targetPath, result);
const target = await this.fsUtil.readJSON(targetPath)
const sourceFile = await this._processTemplateFile(from, projectDescriptor)
const source = JSON.parse(sourceFile)
const result = target && source ? this._deepMerge(target, source) : target || source
await this.fsUtil.writeJSON(targetPath, result)
}

@@ -127,45 +125,43 @@ }

const existences = semantics.existences && semantics.existences.filter(e => e)
existences && existences.forEach(existence => {
const { additions, overwrites, deletions, relationships } = semantics
additions?.forEach(existence => {
existenceMap.set(existence.ref, undefined)
templateExistenceMap.set(existence.ref, undefined)
});
const { deletions } = semantics
deletions && deletions.forEach(deletion => {
existenceMap.set(deletion.existence.ref, undefined)
})
deletions?.forEach(deletion => {
existenceMap.set(deletion.item.ref, undefined)
})
relationships?.forEach(relationship => {
const [existence] = relationship.into
const [insertExistence] = relationship.insert
const ref = insertExistence.ref + ' -> ' + existence.ref
existenceMap.set(ref, undefined)
})
const missingRelationshipsMap = new Map()
const relationships = semantics.relationships && semantics.relationships.filter(r => r)
relationships && relationships.forEach(relationship =>
missingRelationshipsMap.set(relationship.in.ref, undefined)
)
let collectionStack = [target.contents]
const _getProperty = (object, keyPath) => keyPath.reduce((p, n) => p && p[n], object)
const _getYAMLProperty = (object, keyPath) => keyPath.reduce((p, n) => p && p.get(n), object)
const _getProperty = (object, keyPath) => keyPath.split('.').reduce((p, k) => p && p[k], object)
const _getYAMLProperty = (object, keyPath) => keyPath.split('.').reduce((p, k) => p && p.get(k), object)
const _validateConstraints = (node, dict, index) => {
const _addToMap = existences => existences && existences
.filter(existence => {
if (!existence.in) return true
const neededParent = existence.in && dict.get(existence.in.ref) && dict.get(existence.in.ref).node
return collectionStack.includes(neededParent)
})
.forEach(existence => {
const json = JSON.parse(String(node));
const constraintsFulfilled = existence.constraints.every(constraint =>
constraint.value === _getProperty(json, constraint.comparisonKeyPath)
)
if (constraintsFulfilled) {
const [collection] = collectionStack
dict.set(existence.ref, { json, node, index, collection })
}
const _validateWhere = (node, dict, index) => {
const _addToMap = existences => existences?.filter(existence => {
let where = existence.in
if (typeof where === 'string') return true // just a key path on the document root
else where = where[0] // a more complex list of constraints, a sequence/array is involved
const neededParent = dict.get(where.ref)?.node
return collectionStack.includes(neededParent) // lookbehind in parent collection stack
// REVISIT: Only look behind until sequence is reached?
}).forEach(({ where, ref }) => {
const json = JSON.parse(String(node))
const whereFulfilled = where.every(constraint =>
constraint.isEqualTo === _getProperty(json, constraint.property)
)
if (whereFulfilled) {
const [collection] = collectionStack
dict.set(ref, { json, node, index, collection })
}
})
_addToMap(existences)
deletions && _addToMap(deletions.map(deletion => deletion.existence))
if (additions) _addToMap(additions)
if (deletions) _addToMap(deletions.map(deletion => deletion.item))
if (relationships) _addToMap(relationships?.map(({into: [existence]}) => existence).filter(e => e))
}
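The hunk above also switches key paths from arrays (`["parameters", "config", "tenant-mode"]`) to dot-separated strings resolved with a `split('.')` reduce. A minimal sketch of that resolver on plain objects (the `mta` sample and the name `getProperty` are illustrative; the YAML variant uses `p.get(k)` on AST nodes instead of `p[k]`):

```javascript
// Sketch: dot-separated key-path resolution, tolerant of missing segments.
const getProperty = (object, keyPath) =>
  keyPath.split('.').reduce((p, k) => p && p[k], object);

const mta = { parameters: { config: { 'tenant-mode': 'shared' } } };
console.log(getProperty(mta, 'parameters.config.tenant-mode')); // 'shared'
console.log(getProperty(mta, 'parameters.missing.x'));          // undefined
```

The `p && p[k]` guard makes lookups on absent intermediate keys return a falsy value rather than throwing, which is what lets the merging code probe optional sections of `mta.yaml`.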

@@ -176,3 +172,3 @@

if (YAML.isMap(node)) {
actions && actions.visitMap && actions.visitMap(node, index)
actions?.visitMap && actions.visitMap(node, index)
const [collection] = collectionStack

@@ -183,3 +179,3 @@ if (YAML.isSeq(collection)) {

}
} else if (YAML.isPair(node) && node.value && node.value.items) {
} else if (YAML.isPair(node) && node.value?.items) {
collectionStack.unshift(node.value)

@@ -191,4 +187,4 @@ } else if (YAML.isScalar(node) && semantics.forceOverwrite && templateNode) {

if (node.items) {
if (YAML.isSeq(node) && templateNode && templateNode.items) {
actions.mergeCollection && actions.mergeCollection(node, templateNode)
if (YAML.isSeq(node) && templateNode?.items) {
actions?.mergeCollection && actions.mergeCollection(node, templateNode)
_traverseYAMLCollection(node, actions, templateNode)

@@ -198,7 +194,7 @@ } else {

}
} else if (node.value && node.value.items || node.value && semantics.forceOverwrite) {
} else if (node.value?.items || node.value && semantics.forceOverwrite) {
_traverseYAMLNode(node.value, index, actions, templateNode && templateNode.value)
}
if (node.value && node.value.items || shifted) {
if (node.value?.items || shifted) {
collectionStack.shift()

@@ -209,17 +205,16 @@ }

const _traverseYAMLCollection = (collection, actions, templateCollection) => {
function _traverseYAMLCollection (collection, actions, templateCollection) {
if (!collection) return
const keyIndexMap = new Map()
const templateIndexMap = new Map()
const keyToIndex = new Map, templateIndexToIndex = new Map
// Map collection items to their semantic counterpart(s)
collection.items.forEach((node, i) => {
if (node.key) keyIndexMap.set(node.key.value, i)
const existence = [...existenceMap.entries()].find(([,value]) => value && value.node === node)
if (node.key) keyToIndex.set(node.key.value, i)
const existence = [...existenceMap.entries()].find(([,value]) => value?.node === node)
if (existence) {
const [existenceKey] = existence
const templateExistence = existenceKey && templateExistenceMap.get(existenceKey)
const templateIndex = templateExistence && templateExistence.index
if (templateIndex !== undefined) templateIndexMap.set(templateIndex, i)
const templateIndex = templateExistence?.index
if (templateIndex !== undefined) templateIndexToIndex.set(templateIndex, i)
}

@@ -232,4 +227,8 @@ if (!templateCollection) _traverseYAMLNode(node, i, actions)

templateCollection.items.forEach((templateNode, templateIndex) => {
if (YAML.isPair(templateNode)) {
const i = keyIndexMap.get(templateNode.key.value)
if (YAML.isScalar(templateNode)) {
if (!collection.items.map(item => item.value).includes(templateNode.value)) {
collection.add(templateNode)
}
} else if (YAML.isPair(templateNode)) {
const i = keyToIndex.get(templateNode.key.value)
const [collection] = collectionStack

@@ -243,6 +242,6 @@ const targetNode = collection.items && collection.items[i]

} else if (YAML.isMap(templateNode)) {
const targetNode = collection.items[templateIndexMap.get(templateIndex)]
const targetNode = collection.items[templateIndexToIndex.get(templateIndex)]
if (actions.mergeCollection) actions.mergeCollection(targetNode, templateNode)
if (targetNode) {
_traverseYAMLNode(targetNode, templateIndex, actions, templateNode);
_traverseYAMLNode(targetNode, templateIndex, actions, templateNode)
}

@@ -257,3 +256,3 @@ }

visitMap: (node, index) => {
_validateConstraints(node, templateExistenceMap, index)
_validateWhere(node, templateExistenceMap, index)
}

@@ -265,3 +264,3 @@ })

visitMap: (node, index) => {
_validateConstraints(node, existenceMap, index)
_validateWhere(node, existenceMap, index)
}

@@ -271,12 +270,14 @@ })

// 3. Apply overwrites to already found existences
semantics.overwrites && semantics.overwrites.forEach(({ in: inExistence, keyPath, replacement }) => {
const existence = existenceMap.get(inExistence.ref)
if (existence) {
_getYAMLProperty(existence.node, keyPath.slice(0, keyPath.length - 1))
.set(keyPath[keyPath.length - 1], replacement)
}
overwrites?.forEach(({ item, withValue }) => {
const keyPath = Array.isArray(item) && typeof item[0] === 'string' ? item[0] : typeof item === 'string' ? item : item[1]
const inExistence = Array.isArray(item) && typeof item[0] === 'object' ? item[0] : typeof item === 'object' ? item : item[1]
const node = inExistence && existenceMap.get(inExistence.ref) ? existenceMap.get(inExistence.ref).node : collectionStack[collectionStack.length - 1]
const keys = keyPath.split('.')
if (!node.getIn(keys)) return
_getYAMLProperty(node, keys.slice(0, keys.length - 1).join('.'))
.set(keys[keys.length - 1], withValue)
})
// 4. Delete existences from the project (e.g. separate deployer module when adding mtx)
semantics.deletions && semantics.deletions.forEach(({ existence: { ref }, relationships }) => {
deletions?.forEach(({ item: { ref }, relationships }) => {
const existence = existenceMap.get(ref)

@@ -286,12 +287,11 @@ if (!existence) return

relationships && relationships.forEach(relationship => {
const parent = _getYAMLProperty(target, relationship.allWithinKeyPath)
relationships?.forEach(relationship => {
const [allWithinKeyPath, inKeyPath, into] = relationship.allWithin
const parent = _getYAMLProperty(target, allWithinKeyPath)
for (const child of parent.items) {
const c = _getYAMLProperty(child, relationship.inKeyPath)
const i = c.items.findIndex(node =>
node.get(relationship.into) === _getProperty(existence.json, relationship.existenceKeyPath)
)
if (i > -1) {
c.delete(i)
}
const grandchild = _getYAMLProperty(child, inKeyPath)
const i = grandchild?.items?.findIndex(node =>
node.get(into) === _getProperty(existence.json, relationship.removeProperty)
) ?? -1
if (i > -1) grandchild.delete(i)
}

@@ -304,4 +304,3 @@ })

mergePair: (collection, templateNode) => {
const childIsArray = false//templateNode.value.type !== 'MAP' && !!templateNode.value.items
if (YAML.isMap(collection) && !childIsArray) {
if (YAML.isMap(collection)) {
collection.add(templateNode)

@@ -313,21 +312,21 @@ }

let [,parent] = collectionStack
const missingExistences = semantics.existences
.filter(e => e)
additions?.filter(existence => {
if (typeof existence.in === 'string' || Array.isArray(existence.in) && typeof existence.in[0] === 'string') return true
const inExistence = Array.isArray(existence.in) ? existenceMap.get(existence.in[0].ref) : existenceMap.get(existence.in.ref)
return inExistence?.node === parent
})
.filter(existence => {
const requiredParent = existence.in &&
existenceMap.get(existence.in.ref) &&
existenceMap.get(existence.in.ref).node
return !existence.in || requiredParent === parent
})
.filter(({keyPath}) => {
if (keyPath.length > collectionStack.length + 1) return
if (keyPath.length > 1) parent = collectionStack[keyPath.length]
const keyPath = typeof existence.in === 'string' ? existence.in : Array.isArray(existence.in) && typeof existence.in[0] === 'string' ? existence.in[0] : existence.in[1]
const keys = keyPath.split('.')
if (keys.length > collectionStack.length + 1) return
if (keys.length > 1) parent = collectionStack[keys.length]
return _getYAMLProperty(parent, keyPath) === targetNode
})
.filter(({ref}) => !existenceMap.get(ref))
missingExistences.forEach(existence => {
const templateNode = templateExistenceMap.get(existence.ref).node
targetNode.add(templateNode)
})
.filter(({ ref }) => {
return !existenceMap.get(ref)
})
.forEach(({ ref }) => {
const templateNode = templateExistenceMap.get(ref).node
targetNode.add(templateNode)
})
}

@@ -340,3 +339,3 @@ },

visitMap: (node, index) => {
_validateConstraints(node, existenceMap, index)
_validateWhere(node, existenceMap, index)
}

@@ -350,6 +349,7 @@ })

const targetJSON = YAML.parse(String(targetNode))
const relationship = semantics.relationships && semantics.relationships
.filter(r => r)
const relationship = semantics.relationships?.filter(r => r)
.find(relationship => {
const existingNode = _getYAMLProperty(existenceMap.get(relationship.in.ref).node, relationship.inKeyPath)
const [existence, keyPath] = relationship.into
if (!existenceMap.get(existence.ref)) return false
const existingNode = _getYAMLProperty(existenceMap.get(existence.ref).node, keyPath)
return targetNode === existingNode

@@ -359,11 +359,13 @@ })

const missingPairs = relationship.existences
const intoKey = relationship.into[relationship.into.length - 1]
const [existence, existenceKeyPath] = relationship.insert
const missingPairs = [existence]
.filter(({ref}) =>
!targetJSON.some(item =>
_getProperty(existenceMap.get(ref).json, relationship.existenceKeyPath) === item[relationship.into]
_getProperty(existenceMap.get(ref).json, existenceKeyPath) === item[intoKey]
)
)
.map(({ref}) => existenceMap.get(ref).node.get(relationship.into))
.map(({ref}) => existenceMap.get(ref).node.get(intoKey))
missingPairs.forEach(pair => {
targetNode.add({ [relationship.into]: pair })
targetNode.add({ [intoKey]: pair })
})

@@ -370,0 +372,0 @@ }

@@ -37,5 +37,3 @@ const { throwError } = require('./util');

domain: null,
imagePullSecret: {
name: null
}
imagePullSecret: {}
}

@@ -42,0 +40,0 @@ }

const { yaml } = require('./modules');
const path = require('path');
const fs = require('fs').promises;

@@ -13,5 +14,5 @@ const FsUtil = require('../../lib/init/util/fsUtil')

*/
constructor({ builder, template, chartValues, chartDependencies, app, mergeSemantics }) {
constructor({ builder, resources, chartValues, chartDependencies, app, mergeSemantics }) {
this.builder = builder;
this.template = template;
this.resources = resources;
this.chartValues = chartValues;

@@ -26,2 +27,3 @@ this.app = app;

await this.copyTemplates();
await this.copySchema();
await this.addChartValues();

@@ -32,5 +34,11 @@ await this.addChartFile();

async copyTemplates() {
await this.builder.addAllTemplates(this.template);
await this.builder.addAllTemplates(this.resources);
}
async copySchema() {
let schema = JSON.parse(await fs.readFile(path.join(this.resources, 'values.schema.json'), { encoding: 'utf8' }))
await this.builder.writeChartFile('values.schema.json', JSON.stringify(schema, null, 4));
}
async addChartValues() {

@@ -37,0 +45,0 @@ const values = this.chartValues;

@@ -54,2 +54,8 @@ const { cds, yaml } = require('./modules');

} else {
const header = `
# yaml-language-server: $schema=./values.schema.json
`
data = header + data;
await this.writeChartFile(fileName, data);
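The branch above prepends a yaml-language-server modeline so editors can validate the generated values file against `values.schema.json`. A sketch of the resulting file content (the sample values are hypothetical):

```javascript
// Prepend the schema modeline, as the chart writer above does.
const header = '# yaml-language-server: $schema=./values.schema.json\n';
let data = 'replicaCount: 1\nport: 8080\n';
data = header + data;

console.log(data.split('\n')[0]); // # yaml-language-server: $schema=./values.schema.json
```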

@@ -56,0 +62,0 @@ }

@@ -53,8 +53,8 @@ const wrapFeature = require('../wrapFeature');

mergeSemantics.existences.push({
mergeSemantics.additions.push({
ref: `{name}-additional-volumes`,
keyPath: [name, "additionalVolumes"],
constraints: [{
comparisonKeyPath: ["name"],
value: "connectivity-secret"
in: `${name}.additionalVolumes`,
where: [{
property: 'name',
isEqualTo: 'connectivity-secret'
}]

@@ -61,0 +61,0 @@ });

@@ -22,3 +22,3 @@ const fs = require('fs').promises;

const memory = language === 'Java' ? '1G' : '500M';
const cpu = language === 'Java' ? '700m' : '100m';
const cpu = language === 'Java' ? '1000m' : '500m';
const env = language === 'Java' ? {SPRING_PROFILES_ACTIVE: "cloud"} : undefined;
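The new defaults give Java workloads more CPU headroom than Node.js ones. A sketch of the selection logic, using the values from the new version (the helper name is hypothetical):

```javascript
// Mirrors the per-language pod defaults chosen above.
function defaultPodResources(language) {
  const memory = language === 'Java' ? '1G' : '500M';
  const cpu = language === 'Java' ? '1000m' : '500m'; // millicores
  const env = language === 'Java' ? { SPRING_PROFILES_ACTIVE: 'cloud' } : undefined;
  return { cpu, memory, env };
}

console.log(defaultPodResources('Java').cpu);    // 1000m
console.log(defaultPodResources('Node.js').cpu); // 500m
```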

@@ -36,3 +36,2 @@

limits: {
cpu,
'ephemeral-storage': '1G',

@@ -42,3 +41,3 @@ memory

requests: {
cpu: '100m',
cpu,
'ephemeral-storage': '1G',

@@ -70,3 +69,2 @@ memory

await builder.addSubchartFiles(webApplicationChartPath, 'web-application');
await builder.addChartFile(Path.join(__dirname, 'subchart-override', 'templates', '_web_application_helpers.tpl'), {fileName: Path.join('charts', 'web-application', 'templates', '_web_application_helpers.tpl')});
}

@@ -73,0 +71,0 @@ }

@@ -13,2 +13,4 @@ {

"$id": "#/properties/global",
"title": "Helm global values",
"description": "For more information, see https://helm.sh/docs/chart_template_guide/subcharts_and_globals/#global-chart-values",
"type": "object",

@@ -20,2 +22,4 @@ "x-ignore-untested": true,

"$id": "#/properties/global/imagePullSecret",
"title": "Image Pull Secret configuration",
"description": "For more information, see https://kubernetes.io/docs/tasks/configure-pod-container/pull-image-private-registry/",
"additionalProperties": true,

@@ -25,5 +29,9 @@ "type": "object",

"name": {
"title": "Secret name",
"description": "Name of the Kubernetes Secret, used as an image pull secret (must be of type kubernetes.io/dockerconfigjson). Can't be used with the `dockerconfigjson` option.",
"$ref": "#/definitions/KubernetesName"
},
"dockerconfigjson": {
"title": "Secret content",
"description": "The content for the dynamically generated Kubernetes Secret, which will be used as an image pull secret. Can't be used with the `name` option.",
"type": "string"

@@ -48,2 +56,20 @@ }

}
},
"image": {
"$id": "#/properties/global/image",
"title": "Image configuration",
"description": "Either name or repository is required.",
"type": "object",
"required": [],
"additionalProperties": true,
"properties": {
"tag": {
"$id": "#/properties/image/global/properties/tag",
"$ref": "#/definitions/ImageTag"
},
"registry": {
"$id": "#/properties/image/global/properties/registry",
"$ref": "#/definitions/ImageRegistry"
}
}
}

@@ -54,2 +80,4 @@ }

"$id": "#/properties/nameOverride",
"title": "Chart name override",
"description": "Will be used instead of the `.Chart.Name`, e.g. when generating the Deployment name.",
"type": "string",

@@ -60,2 +88,4 @@ "pattern": "[0-9a-z][0-9a-z-.]*"

"$id": "#/properties/fullnameOverride",
"title": "Override for the `.fullname` helper function.",
"description": "Will be used as an override for the `.fullname` helper function (i.e. `.Release.Name-.Chart.Name`).",
"type": "string",

@@ -66,2 +96,4 @@ "pattern": "[0-9a-z][0-9a-z-.]*"

"$id": "#/properties/replicaCount",
"title": "Replica count",
"description": "Number of desired pods within the Deployment.",
"type": "integer",

@@ -73,2 +105,4 @@ "minimum": 1,

"$id": "#/properties/port",
"title": "Port",
"description": "Application's exposed port.",
"type": "integer",

@@ -81,2 +115,4 @@ "minimum": 1,

"$id": "#/properties/serviceAccountName",
"title": "Service Account name",
"description": "Name of the Service Account assigned to pods.",
"default": "default",

@@ -91,2 +127,3 @@ "allOf": [

"$id": "#/properties/image",
"title": "Image configuration",
"type": "object",

@@ -100,2 +137,4 @@ "required": [

"$id": "#/properties/image/properties/repository",
"title": "Repository of the image",
"description": "Should also include the image name (i.e. everything before the `:` sign).",
"type": "string",

@@ -106,6 +145,7 @@ "pattern": "^[\\w-./:]*[@sha256]*$"

"$id": "#/properties/image/properties/tag",
"type": "string",
"$comment": "Copied from https://github.com/containers/image/blob/18d58d29fdc4fc32fb8a8a6d186b829b217f1bf5/docker/reference/regexp.go#L68-L70",
"pattern": "^((?:(?:[a-zA-Z0-9]|[a-zA-Z0-9][a-zA-Z0-9-]*[a-zA-Z0-9])(?:(?:\\.(?:[a-zA-Z0-9]|[a-zA-Z0-9][a-zA-Z0-9-]*[a-zA-Z0-9]))+)?(?::[0-9]+)?/)?[a-z0-9]+(?:(?:(?:[._]|__|[-]*)[a-z0-9]+)+)?(?:(?:/[a-z0-9]+(?:(?:(?:[._]|__|[-]*)[a-z0-9]+)+)?)+)?)(?::([\\w][\\w.-]{0,127}))?(?:@([A-Za-z][A-Za-z0-9]*(?:[-_+.][A-Za-z][A-Za-z0-9]*)*[:][[:xdigit:]]{32,}))?$",
"default": "latest"
"$ref": "#/definitions/ImageTag"
},
"registry": {
"$id": "#/properties/image/properties/registry",
"$ref": "#/definitions/ImageRegistry"
}

@@ -116,2 +156,4 @@ }

"$id": "#/properties/imagePullSecret",
"title": "Image Pull Secret configuration",
"description": "For more information, see https://kubernetes.io/docs/tasks/configure-pod-container/pull-image-private-registry/",
"additionalProperties": false,

@@ -121,5 +163,9 @@ "type": "object",

"name": {
"title": "Secret name",
"description": "Name of the Kubernetes Secret, used as an image pull secret (must be of type kubernetes.io/dockerconfigjson). Can't be used with the `dockerconfigjson` option.",
"$ref": "#/definitions/KubernetesName"
},
"dockerconfigjson": {
"title": "Secret content",
"description": "The content for the dynamically generated Kubernetes Secret, which will be used as an image pull secret. Can't be used with the `name` option.",
"type": "string"

@@ -147,2 +193,4 @@ }

"$id": "#/properties/additionalVolumes",
"title": "Additional Pod volumes",
"description": "List of volumes, which should be mounted into Pods",
"type": "array",

@@ -152,2 +200,3 @@ "additionalItems": false,

"items": {
"title": "Additional volume configuration",
"type": "object",

@@ -161,2 +210,3 @@ "additionalProperties": true,

"name": {
"title": "Name of the volume",
"type": "string",

@@ -166,2 +216,3 @@ "$ref": "#/definitions/KubernetesName"

"volumeMount": {
"title": "Volume mount configuration",
"type": "object",

@@ -174,2 +225,3 @@ "required": [

"mountPath": {
"description": "Path within a Pod, where the volume should be mounted.",
"type": "string",

@@ -179,11 +231,16 @@ "pattern": "^[^:]*$"

"mountPropagation": {
"type": "string"
"description": "Mount propagation allows for sharing volumes mounted by a container to other containers in the same pod, or even to other pods on the same node.",
"type": "string",
"enum": ["None", "HostToContainer"]
},
"readOnly": {
"description": "Whether mounted volume should be in read-only mode.",
"type": "boolean"
},
"subPath": {
"description": "Sub-path inside the referenced volume instead of its root.",
"type": "string"
},
"subPathExpr": {
"description": "Similar to the `subPath`, but can be constructed using the downward API environment variables. For more info, see https://kubernetes.io/docs/concepts/storage/volumes/#using-subpath-expanded-environment",
"type": "string"

@@ -198,2 +255,3 @@ }

"$id": "#/properties/ha",
"title": "High Availability configuration",
"type": "object",

@@ -204,2 +262,3 @@ "additionalProperties": false,

"$id": "#/properties/ha/properties/enabled",
"description": "Enables additional high-availability related configuration, like Pod Disruption Budget and Topology Spread Constraints.",
"type": "boolean",

@@ -212,2 +271,3 @@ "default": true

"$id": "#/properties/resources",
"title": "Pod resources configuration",
"type": "object",

@@ -222,2 +282,3 @@ "additionalProperties": false,

"type": "object",
"description": "Minimal required resources for the application to operate, that will be reserved for each replica.",
"additionalProperties": false,

@@ -230,13 +291,8 @@ "required": [

"cpu": {
"description": "CPU resource units, as described here https://kubernetes.io/docs/concepts/configuration/manage-resources-containers/#meaning-of-cpu",
"$id": "#/properties/resources/properties/requests/properties/cpu",
"oneOf": [
{
"type": "number"
},
{
"type": "string"
}
]
"type": ["string", "number"]
},
"ephemeral-storage": {
"description": "Size of the local ephemeral storage, measured in bytes. For more info, see https://kubernetes.io/docs/concepts/configuration/manage-resources-containers/#local-ephemeral-storage",
"$id": "#/properties/resources/properties/requests/properties/ephemeral-storage",

@@ -246,2 +302,3 @@ "type": "string"

"memory": {
"description": "Amount of memory, mesaured in bytes. For more info, see https://kubernetes.io/docs/concepts/configuration/manage-resources-containers/#meaning-of-memory",
"$id": "#/properties/resources/properties/requests/properties/memory",

@@ -254,2 +311,3 @@ "type": "string"

"type": "object",
"description": "",
"additionalProperties": false,

@@ -261,13 +319,8 @@ "required": [

"cpu": {
"description": "CPU resource units, as described here https://kubernetes.io/docs/concepts/configuration/manage-resources-containers/#meaning-of-cpu",
"$id": "#/properties/resources/properties/limits/properties/cpu",
"oneOf": [
{
"type": "number"
},
{
"type": "string"
}
]
"type": ["string", "number"]
},
"ephemeral-storage": {
"description": "Size of the local ephemeral storage, measured in bytes. For more info, see https://kubernetes.io/docs/concepts/configuration/manage-resources-containers/#local-ephemeral-storage",
"$id": "#/properties/resources/properties/limits/properties/ephemeral-storage",

@@ -277,2 +330,3 @@ "type": "string"

"memory": {
"description": "Amount of memory, mesaured in bytes. For more info, see https://kubernetes.io/docs/concepts/configuration/manage-resources-containers/#meaning-of-memory",
"$id": "#/properties/resources/properties/limits/properties/memory",

@@ -287,2 +341,3 @@ "type": "string"

"$id": "#/properties/health_check",
"title": "Health-check configuration",
"type": "object",

@@ -293,2 +348,3 @@ "additionalProperties": false,

"$id": "#/properties/health_check/properties/liveness",
"description": "Liveness-probe configuration.",
"type": "object",

@@ -298,2 +354,3 @@ "additionalProperties": false,

"path": {
"description": "HTTP path used by Kubernetes, to perform health-check calls.",
"type": "string",

@@ -306,2 +363,3 @@ "default": "/healthz"

"$id": "#/properties/health_check/properties/readiness",
"description": "Readiness-probe configuration.",
"type": "object",

@@ -311,2 +369,3 @@ "additionalProperties": false,

"path": {
"description": "HTTP path used by Kubernetes, to perform health-check calls.",
"type": "string",

@@ -321,2 +380,3 @@ "default": "/healthz"

"$id": "#/properties/startupTimeout",
"description": "Initial timeout in seconds, during which the app must start giving the response to the liveness-probe.",
"type": "integer",

@@ -328,2 +388,3 @@ "minimum": 1,

"$id": "#/properties/env",
"description": "Key-value map of environment variables, which should be added to the Pod spec.",
"type": "object",

@@ -333,10 +394,5 @@ "patternProperties": {

"$comment": "regex above copied from https://github.com/kubernetes/kubernetes/blob/ea0764452222146c47ec826977f49d7001b0ea8c/staging/src/k8s.io/apimachinery/pkg/util/validation/validation.go#L402",
"anyOf": [
{
"type": "string"
},
{
"type": "integer"
}
]
"not": {
"type": ["null", "object", "array"]
}
}
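In JSON Schema, `not` with `type: ["null", "object", "array"]` accepts any scalar value (string, number, boolean). A sketch of the equivalent check in plain JavaScript (the helper name is hypothetical):

```javascript
// Accepts scalars, rejects null, objects and arrays: the effect of the
// `not: { type: ["null", "object", "array"] }` constraint above.
const isAllowedEnvValue = v => v !== null && typeof v !== 'object';

console.log(isAllowedEnvValue('cloud')); // true
console.log(isAllowedEnvValue(8080));    // true
console.log(isAllowedEnvValue(null));    // false
console.log(isAllowedEnvValue(['a']));   // false (arrays are objects in JS)
```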

@@ -349,2 +405,3 @@ },

"$id": "#/properties/envSecretNames",
"description": "List of Kubernetes Secret names, used as sources for the Pod's environment variables.",
"type": "array",

@@ -359,2 +416,3 @@ "uniqueItems": true,

"$id": "#/properties/expose",
"title": "",
"type": "object",

@@ -365,2 +423,3 @@ "additionalProperties": false,

"$id": "#/properties/expose/properties/host",
"description": "Specifies the service's dns name for inbound external traffic. If it doesn't contain a dot, the default cluster domain will be appended.",
"type": "string"

@@ -370,2 +429,3 @@ },

"$id": "#/properties/expose/properties/enabled",
"description": "Expose the application to the internet.",
"type": "boolean",

@@ -378,2 +438,3 @@ "default": true

"$id": "#/properties/bindings",
"title": "Service Binding configuration",
"type": "object",

@@ -392,2 +453,3 @@ "additionalProperties": false,

"fromSecret": {
"description": "Name of a Kubernetes Secret, with the binding content, compliant to the SAP Kubernetes Service Binding spec https://github.tools.sap/Kubernetes-Service-Bindings/doc/",
"$ref": "#/definitions/KubernetesName"

@@ -402,17 +464,23 @@ }

"serviceInstanceName": {
"description": "Name of a BTP Operator Service Instance, created by the `service-instance` Helm chart. Can't be used with the `serviceInstanceFullname` option.",
"$ref": "#/definitions/KubernetesName"
},
"serviceInstanceFullname": {
"description": "Full name of a BTP Operator Service Instance. Can't be used with the `serviceInstanceName` option.",
"$ref": "#/definitions/KubernetesName"
},
"externalName": {
"description": "The name for the service binding in SAP BTP",
"type": "string"
},
"secretName": {
"description": "The name of the secret where the credentials are stored.",
"$ref": "#/definitions/KubernetesName"
},
"parameters": {
"description": "Some services support the provisioning of additional configuration parameters during the bind request. For the list of supported parameters, check the documentation of the particular service offering.",
"type": "object"
},
"parametersFrom": {
"description": "List of sources to populate parameters.",
"type": "array",

@@ -423,2 +491,3 @@ "items": {

"type": "object",
"description": "Kubernetes Secret as a parameters source.",
"additionalProperties": false,

@@ -429,7 +498,13 @@ "properties": {

"additionalProperties": false,
"required": [
"name",
"key"
],
"properties": {
"name": {
"description": "Name of a Secret.",
"$ref": "#/definitions/KubernetesName"
},
"key": {
"description": "Key in that Secret, which contains a string that represents the json to include in the set of parameters to be sent to the broker.",
"type": "string"

@@ -443,2 +518,3 @@ }

"type": "object",
"description": "Kubernetes Config Map as a parameters source.",
"additionalProperties": false,

@@ -449,7 +525,13 @@ "properties": {

"additionalProperties": false,
"required": [
"name",
"key"
],
"properties": {
"name": {
"description": "Name of a Config Map",
"$ref": "#/definitions/KubernetesName"
},
"key": {
"description": "Key in that Config Map, which contains a string that represents the json to include in the set of parameters to be sent to the broker.",
"type": "string"

@@ -465,2 +547,3 @@ }

"credentialsRotationPolicy": {
"description": "Holds automatic credentials rotation configuration. For more details, see https://github.com/SAP/sap-btp-service-operator#spec-1",
"type": "object"

@@ -496,4 +579,16 @@ }

"pattern": "^[a-z0-9]([-a-z0-9]*[a-z0-9])?(\\.[a-z0-9]([-a-z0-9]*[a-z0-9])?)*$"
}
},
"ImageRegistry": {
"type": "string",
"title": "Image registry",
"description": "Image registry e.g. docker.io",
"pattern": "^[\\w-./]+$"
},
"ImageTag": {
"title": "Image tag",
"description": "Image tag without the name (everything after the `:` sign, potentially including the `@sha256` section at the end).",
"type": "string",
"$comment": "Copied from https://github.com/containers/image/blob/18d58d29fdc4fc32fb8a8a6d186b829b217f1bf5/docker/reference/regexp.go#L68-L70",
"pattern": "^((?:(?:[a-zA-Z0-9]|[a-zA-Z0-9][a-zA-Z0-9-]*[a-zA-Z0-9])(?:(?:\\.(?:[a-zA-Z0-9]|[a-zA-Z0-9][a-zA-Z0-9-]*[a-zA-Z0-9]))+)?(?::[0-9]+)?/)?[a-z0-9]+(?:(?:(?:[._]|__|[-]*)[a-z0-9]+)+)?(?:(?:/[a-z0-9]+(?:(?:(?:[._]|__|[-]*)[a-z0-9]+)+)?)+)?)(?::([\\w][\\w.-]{0,127}))?(?:@([A-Za-z][A-Za-z0-9]*(?:[-_+.][A-Za-z][A-Za-z0-9]*)*[:][[:xdigit:]]{32,}))?$" }
}
}

@@ -22,2 +22,12 @@ const wrapFeature = require('../wrapFeature');

}
},
resources: {
limits: {
cpu: "2000m",
memory: "1G"
},
requests: {
cpu: "1000m",
memory: "1G"
}
}

@@ -24,0 +34,0 @@ }

@@ -25,2 +25,12 @@ const wrapFeature = require('../wrapFeature');

}
},
resources: {
limits: {
cpu: "2000m",
memory: "1G"
},
requests: {
cpu: "1000m",
memory: "1G"
}
}

@@ -27,0 +37,0 @@ }

@@ -47,3 +47,3 @@ const Context = require('./Context');

const mergeSemantics = {
existences: []
additions: []
};

@@ -97,3 +97,3 @@

const chartAdder = new ChartAdder({ builder, chartValues, chartDependencies, template: resource('helm/chart'), app: model.app, mergeSemantics });
const chartAdder = new ChartAdder({ builder, chartValues, chartDependencies, resources: resource('helm/chart'), app: model.app, mergeSemantics });
await chartAdder.execute();

@@ -100,0 +100,0 @@

@@ -61,6 +61,12 @@ const os = require("os");

*/
async sanitizeEslintConfig(configPath, customRuleExample = false, logger = console) {
async sanitizeEslintConfig(configIn, customRuleExample = false, logger = console) {
let configContents = {};
if (await fsUtil.pathExists(configPath)) {
configContents = (await this.readEslintConfig(configPath)) || {};
const isFilePath = typeof configIn === "string";
if (isFilePath) {
if (await fsUtil.pathExists(configIn)) {
configContents = (await this.readEslintConfig(configIn)) || {};
}
} else {
configContents = { ...configIn };
}

@@ -94,4 +100,6 @@ const configType = "recommended";

}
await this.writeEslintConfig(configPath, configContents, logger);
return;
if (isFilePath) {
await this.writeEslintConfig(configIn, configContents, logger);
}
return configContents;
},

@@ -98,0 +106,0 @@

@@ -1,6 +0,1 @@

const os = require("os");
const fs = require("fs");
const commandUtil = require("../init/util/commandUtil");
const path = require("path");
const glob = require("glob");
const term = require("../util/term");

@@ -11,9 +6,8 @@

const generators = require("./generators");
const {
getConfigPath,
getFileExtensions,
} = require("@sap/eslint-plugin-cds/lib/api");
const IS_WIN = os.platform() === "win32";
const ALLOWED_FILE_EXTENSIONS = getFileExtensions();
const cds = require("../../lib/cds");
const { path, exists, isdir, isfile, readdir } = cds.utils
const { exit } = require("process");
const LOG = cds.debug("lint");
const LOG_CONFIG = cds.debug("lint:config");

@@ -24,4 +18,2 @@ class Linter {

this.eslintCmdShort = "eslint";
this.eslintCmdOpts = {};
this.eslintCmdOptsIndex = 0;
this.eslintCmdFileExpr = "";

@@ -33,3 +25,2 @@ this.isFile = false;

this.configPath = "";
this.configGlobalPath = path.join(__dirname, ".eslintrc.js");
this.configContents = {};

@@ -40,4 +31,9 @@ this.projectPath = "";

this.extendsPlugin = false;
this.fileExtensions = [];
this.pluginPath = "";
this.pluginApi = {};
this.lintType = "global";
this.ruleOpts = {};
this.customRulesOpts = [];
this.pluginRules = [];
this.init = this.init.bind(this);

@@ -57,12 +53,7 @@ this.lint = this.lint.bind(this);

flags = generators.genEslintFlags(help);
[help, options, shortcuts, flags] = generators.genEslintShortcutsAndOpts(
help,
flags
);
[help, options, shortcuts, flags] = generators.genEslintShortcutsAndOpts(help, flags);
this.help = help;
this.flags = flags;
} else {
console.log(
`${term.error("Cannot call 'eslint -h', install and try again.")}\n`
);
console.log(`${term.error("Cannot call 'eslint -h', install and try again.")}\n`);
}

@@ -80,3 +71,3 @@ return { help, options, flags, shortcuts };

this.eslintCmdOpts = this._sanitizeUserOpts(options);
this.eslintCmdFileExpr = args.length ? args : ['.']
this.eslintCmdFileExpr = args.length ? args : ["."];
if (this.eslintCmdOpts.help) {

@@ -88,15 +79,8 @@ this._printHelp();

if (this.lintType === "local") {
await this._addRulePaths();
}
// Add custom rules
await this._addRulePaths();
if (this.lintType === "global") {
this._addConfig();
const overwriteRules = checks.hasEslintConfigContent(
this.configContents,
"rules"
);
if (overwriteRules) {
await this._overwriteRuleSeverities();
}
const overwriteRules = checks.hasEslintConfigContent(this.configContents, "rules");
if (overwriteRules) {
await this._overwriteRuleSeverities();
}

@@ -106,2 +90,3 @@

this._addExtensions();
// Run ESLint with collected options

@@ -117,8 +102,3 @@ try {

_printHelp() {
console.log(
this.help.replace(
/ \*([^*]+)\*/g,
` ${term.codes.bold}$1${term.codes.reset}`
)
);
console.log(this.help.replace(/ \*([^*]+)\*/g, ` ${term.codes.bold}$1${term.codes.reset}`));
return;

@@ -139,8 +119,5 @@ }

const rulesPath = path.relative(".", ".eslint/rules");
if (fs.existsSync(rulesPath) && fs.statSync(rulesPath).isDirectory()) {
this.esllintCmdOpts = io.mergeWithUserOpts(
this.eslintCmdOpts,
"rulesdir",
path.relative(".", rulesPath)
);
if (await exists(rulesPath) && await isdir(rulesPath)) {
const opts = io.mergeWithUserOpts(this.eslintCmdOpts, "rulesdir", path.relative(".", rulesPath));
this.customRulesOpts = opts.rulesdir;
await generators.genDocs(this.projectPath.replace(rulesPath, ""));

@@ -151,124 +128,134 @@ }

/**
 * Searches for ESLint config file types (in order of precedence)
 * and returns the path of the closest config file (usually in the project's root dir)
 * https://eslint.org/docs/user-guide/configuring#configuration-file-formats
 * @param {string} currentDir start here and search upwards until the root dir
 * @returns {string} path of the ESLint config file ("" if none exists)
 */
async _getConfigPath(currentDir = ".") {
const configFiles = [
".eslintrc.js",
".eslintrc.cjs",
".eslintrc.yaml",
".eslintrc.yml",
".eslintrc.json",
".eslintrc",
"package.json",
];
let configDir = path.resolve(currentDir);
while (configDir !== path.resolve(configDir, "..")) {
for (let i = 0; i < configFiles.length; i++) {
const configPath = path.join(configDir, configFiles[i]);
if (await exists(configPath) && await isfile(configPath)) {
return configPath;
}
}
configDir = path.join(configDir, "..");
}
return "";
}
_addExtensions() {
// Add CDS file extensions to lint
const extensions = ALLOWED_FILE_EXTENSIONS.map((ext) => path.extname(ext));
this.eslintCmdOpts = io.mergeWithUserOpts(
this.eslintCmdOpts,
"ext",
extensions.join(',')
);
// Enable colored output
this.eslintCmdOpts = io.mergeWithUserOpts(this.eslintCmdOpts, "color");
this.fileExtensions = this.pluginApi.getFileExtensions().map((ext) => path.extname(ext));
// Only lint file extensions prescribed by plugin
this.ignorePatterns = [];
this.fileExtensions.forEach((ext) => {
this.ignorePatterns.push(`!${ext}`);
});
return;
}
async _runEslint(dryRun = false) {
const eslintCmdOpts = [];
// Collect cmd line options
async _runEslint() {
try {
for (let key in this.eslintCmdOpts) {
let value = this.eslintCmdOpts[key];
if (this.flags.includes(`--${key}`)) {
eslintCmdOpts.push(`--${key}`);
} else {
if (value) {
eslintCmdOpts.push(`--${key}`);
// Always quote args which are not paths
if (!fs.existsSync(value)) {
eslintCmdOpts.push(`"${value}"`);
// Paths are quoted according on os
} else {
if (IS_WIN) {
eslintCmdOpts.push(`"${value}"`);
} else {
eslintCmdOpts.push(value);
}
}
}
}
const currentDir = process.cwd();
const { ESLint } = require("eslint");
let eslintOpts = {
cwd: currentDir,
extensions: this.fileExtensions,
overrideConfig: {
...this.configContents,
},
useEslintrc: false,
};
if (this.lintType === "global") {
eslintOpts = {
cwd: process.cwd(),
extensions: this.fileExtensions,
overrideConfig: {
plugins: ["@sap/eslint-plugin-cds"],
extends: "plugin:@sap/cds/recommended"
},
useEslintrc: false,
resolvePluginsRelativeTo: this.cdsdkPath
};
}
// Only lint file extensions prescribed by plugin
eslintCmdOpts.push("--ignore-pattern");
let ignorePattern = "*.*";
if (IS_WIN) {
ignorePattern = `"${ignorePattern}"`;
} else {
ignorePattern = `'${ignorePattern}'`;
if (this.customRulesOpts && this.customRulesOpts.length > 0) {
eslintOpts.rulePaths = [this.customRulesOpts]
}
eslintCmdOpts.push(ignorePattern);
ALLOWED_FILE_EXTENSIONS.forEach((ext) => {
eslintCmdOpts.push("--ignore-pattern");
ignorePattern = `!${ext}`;
if (IS_WIN) {
ignorePattern = `"${ignorePattern}"`;
} else {
ignorePattern = `'${ignorePattern}'`;
LOG_CONFIG && LOG_CONFIG(eslintOpts);
let lintString = `eslint`;
if (this.fileExtensions) { lintString += ` --ext "${this.fileExtensions.join(",")}"`};
for (const ruleOpt of Object.entries(this.ruleOpts)) {
lintString += ` --rule ${ruleOpt[0]}:${ruleOpt[1]}`;
}
if (this.customRulesOpts && this.customRulesOpts.length > 0) {
lintString += ` --rulesdir "${this.customRulesOpts}"`
};
LOG && LOG(lintString);
if (!process.env.isTest) {
const eslint = new ESLint(eslintOpts);
const formatter = await eslint.loadFormatter("stylish");
let results = (await eslint.lintText("")).map((result) => {
result.filePath = path.resolve(currentDir);
return result;
}).filter(result => result.messages.length > 0);
const files = await readdir(currentDir);
const hasFiles = files.some(file => isfile(file));
if (hasFiles) {
const resultsModel = await eslint.lintFiles(this.eslintCmdFileExpr);
results = results.concat(resultsModel);
}
eslintCmdOpts.push(ignorePattern);
});
let input = "";
if (IS_WIN) {
// Is this distinction really necessary? Doesn't Windows also support single quotes?
input = `"${this.eslintCmdFileExpr.join('" "')}"`;
} else {
input = `'${this.eslintCmdFileExpr.join("' '")}'`;
if (results && results.length > 0) {
console.log(formatter.format(results));
}
}
// Run ESLint with given cmd line options
// - Show 'eslint' instead of actual eslint cmd ('node .../.eslint.js')
// - Remove debug flag for printed cmd line
if (this.eslintCmdOpts.debug) {
eslintCmdOpts.splice(eslintCmdOpts.indexOf('--debug'), 1);
console.log(
`Linting:\n ${term.info(`eslint ${eslintCmdOpts.concat(input).join(' ')}`)}\n`
);
eslintCmdOpts.push('--debug');
}
if (!dryRun) {
return commandUtil.spawnCommand(this.eslintCmd, eslintCmdOpts.concat(input), {
cwd: process.cwd(),
shell: true,
stdio: process.env.NODE_ENV === 'test' ? 'pipe' : 'inherit',
});
} else {
return Promise.resolve("");
}
} catch (err) {
return Promise.resolve(err);
console.log(err);
exit(1)
}
}
_addConfig() {
this.esllintCmdOpts = io.mergeWithUserOpts(
this.eslintCmdOpts,
"config",
path.relative(".", this.configGlobalPath)
);
return;
}
async _getConfigFileAndSetProjectPath(args) {
let input;
// Only resolve the first input argument
//
// Get config path
if (args.length > 0) {
let stats;
try {
stats = fs.statSync(args[0]);
} catch (err) {
// Do nothing
}
if (
args[0] === "." ||
(stats && (stats.isFile() || stats.isDirectory()))
) {
const firstArg = args[0];
if (firstArg === "." || (isfile(firstArg) || isdir(firstArg))) {
this.isFile = true;
}
input = glob.sync(args[0])[0];
input = args.join(" ");
if (!input) {
input = ".";
}
}
this.configPath = getConfigPath(input);
this.configPath = await this._getConfigPath(process.cwd());
if (!this.configPath) {
this.configPath = this.configGlobalPath;
this.lintType = "global";
this.configContents = await io.sanitizeEslintConfig({}, false, LOG);
} else {
this.configContents = await io.readEslintConfig(this.configPath);
const configContents = await io.readEslintConfig(this.configPath);
this.configContents = await io.sanitizeEslintConfig(configContents, false, LOG);
if (this.configContents) {

@@ -282,5 +269,10 @@ if ("extends" in this.configContents) {

this.lintType = "local";
this.pluginPath = require.resolve("@sap/eslint-plugin-cds", {
paths: [path.dirname(this.configPath)],
});
try {
this.pluginPath = require.resolve("@sap/eslint-plugin-cds", {
paths: [path.dirname(this.configPath)],
});
} catch (err) {
// Do nothing
}
this.pluginApi = require(this.pluginPath.replace("index.js", "api"));
}

@@ -294,2 +286,3 @@ }

});
this.pluginApi = require(this.pluginPath.replace("index.js", "api"));
}

@@ -302,17 +295,21 @@ // Project path is directory of ESLint config file

async _overwriteRuleSeverities() {
let configContents = (await io.readEslintConfig(this.configPath)) || {};
let configContents = {};
if (this.configPath) {
configContents = await io.readEslintConfig(this.configPath);
} else {
configContents = await io.readEslintConfig(await this._getConfigPath(this.cdsdkPath));
}
let rules = configContents.rules;
// Allow recommended plugin rules in cds-dk to be overwritten
// by user by adding rule to cmd line (because of precedence)
const pluginRules = require(this.pluginPath).rules;
const pluginRules = require(this.pluginPath).configs.recommended.rules;
if (rules && pluginRules) {
Object.keys(rules).forEach((rule) => {
if (Object.keys(pluginRules).includes(rule.replace("@sap/cds/", ""))) {
this.esllintCmdOpts = io.mergeWithUserOpts(
this.eslintCmdOpts,
"rule",
`${rule}:${rules[rule]}`
);
for (const rule of Object.keys(rules)) {
if (typeof rules[rule] !== "undefined" && rules[rule] != pluginRules[rule] ||
isfile(path.join(".eslint", "rules", rule))) {
pluginRules[rule] = rules[rule];
this.ruleOpts[rule] = rules[rule];
this.pluginRules = pluginRules;
}
});
}
}

@@ -319,0 +316,0 @@ return;

{
"name": "@sap/cds-dk",
"version": "6.0.4",
"version": "6.1.1",
"description": "Command line client and development toolkit for the SAP Cloud Application Programming Model",

@@ -19,5 +19,5 @@ "homepage": "https://cap.cloud.sap/",

"dependencies": {
"@sap/cds": "^6.0.3",
"@sap/cds": "^6.1.0",
"@sap/cds-foss": "^4",
"@sap/eslint-plugin-cds": "^2.3.3",
"@sap/eslint-plugin-cds": "^2.5.0",
"axios": ">=0.21",

@@ -27,2 +27,3 @@ "connect-livereload": "^0.6.1",

"express": "^4.17.1",
"glob": "^8.0.3",
"htmlparser2": "^8.0.0",

@@ -38,3 +39,3 @@ "livereload-js": "^3.3.1",

"optionalDependencies": {
"sqlite3": "5.0.8"
"sqlite3": "5.0.11"
},

@@ -41,0 +42,0 @@ "files": [
