@sap/cds-dk - npm Package Compare versions

Comparing version 7.5.1 to 7.6.0

lib/env/schemas/cds-rc.js


bin/add.js

@@ -13,51 +13,55 @@ const cds = require('../lib/cds')

-Adds one or more features to an existing project - grow as you go.
+Add one or more features to an existing project - grow as you go.
-The following features are supported so far:
+The following features can be added:
 *hana* - add support for SAP HANA
-*sqlite* - add support for SQLite databases
+*sqlite* - add support for SQLite
+*postgres* - add support for PostgreSQL
+*liquibase* - add support for Liquibase
 *xsuaa* - add support for authentication via XSUAA
 *multitenancy* - add support for multitenancy
 *toggles* - add support for feature toggles
 *extensibility* - add support for extensibility
 *approuter* - add support for application routing
 *local-messaging* - add support for local messaging
 *file-based-messaging* - add support for file-based messaging
 *enterprise-messaging* - add support for SAP Event Mesh
 *redis-messaging* - add support for Redis messaging
 *kibana* - add support for Kibana formatting
 *mta* - add support for MTA-based deployment
 *cf-manifest* - add support for CF-native deployment
 *helm* - add support for Helm-based Kyma deployment
 *html5-repo* - add support for the HTML5 repository
 *pipeline* - add files for CI/CD pipeline integration
 *tiny-sample* - add minimal sample files
 *sample* - add sample files including Fiori UI
 *data* - add CSV headers for modeled entities
 *typer* - add type generation for CDS models
 *lint* - add support for CDS Lint
# OPTIONS

@@ -64,0 +68,0 @@

 module.exports = Object.assign(cds_bind, {
-  options: ['--to', '--for', '--on', '--kind', '--output-file'],
+  options: ['--to', '--for', '--on', '--kind', '--output-file', '--to-app-services', '--no-create-service-key'],
   flags: ['--exec'],
-  shortcuts: ['-2', '-4', '-n', '-k', '-o'],
+  shortcuts: ['-2', '-4', '-n', '-k', '-o', '-a'],
   help: `

@@ -24,29 +24,44 @@ # SYNOPSIS

 *-2 | --to* <instance>[:<key>] | <service-binding> | <secret>
-bind to a given Cloud Foundry instance, Kubernetes service binding or Kubernetes secret.
+Bind to a given Cloud Foundry instance, Kubernetes service binding or Kubernetes secret.
 *-4 | --for* <profile>
-store binding information under <profile> in *.cdsrc-private.json*. Default *hybrid* is used
-if '--for' is not specified
+Profile to store binding information. Defaults to *hybrid*.
 *-n | --on* cf | k8s
-bind to service on Cloud Foundry or Kubernetes, defaults to Cloud Foundry
+Target platform (Cloud Foundry or Kubernetes) to bind to. Defaults to *cf*.
 *-k | --kind* <kind>
-the kind of service
+Kind of the service.
 *-o | --output-file* <path>
-save bindings to the given *.cdsrc.json* or *package.json* file. Default is *.cdsrc-private.json*.
-If *path* is a directory, then it will save it to the *.cdsrc.json* file in that directory.
+Output file for added binding information. Use this option if binding configuration should
+be added to *package.json* or *.cdsrc.json*. Defaults to *.cdsrc-private.json*.
+If *path* is a directory, the *.cdsrc.json* file in that directory is used.
+*-a | --to-app-services <app>* (Beta)
+Bind to a given application (Cloud Foundry only).
+*--no-create-service-key*
+Skip automatic creation of service keys.
 # EXAMPLE
-cds bind --to my-hdi-container
-cds bind --to my-hdi-container,my-xsuaa
-cds bind uaa --to my-xsuaa:my-xsuaa-key --kind xsuaa --for myprofile
-cds bind --to my-hdi-container --output-file .
-cds bind --to my-hdi-container --output-file package.json
-cds bind --to my-hdi-container --on k8s
-cds bind --to my-hdi-container --for my-profile
+cds bind --to bookshop-db
+cds bind --to bookshop-db,bookshop-auth
+cds bind auth --to bookshop-auth:bookshop-auth-key --kind xsuaa --for my-profile
+cds bind --to bookshop-db --output-file .
+cds bind --to bookshop-db --output-file package.json
+cds bind --to bookshop-db --on k8s
+cds bind --to bookshop-db --for my-profile
+cds bind --to-app-services bookshop-srv
 `

@@ -76,25 +91,37 @@ });

-if (!options.to) {
-  throw `Use option --to or -2 to specify the target instance, e.g. cds bind --to myInstance:myService`;
-}
-if (args.length > 1) {
-  throw `Too many arguments: Please specify only one or no service.`;
-}
-options.serviceArg = args[0]
-options.targets = options.to.split(/,/g);
-if (options.targets.length >= 2 && options.serviceArg) {
-  throw `Service argument cannot be specified together with multiple targets ('--to') services. Use one service per call or omit the service argument.`;
-}
-if (options.targets.length >= 2 && options.kind) {
-  throw `The option '--kind' cannot be specified together with multiple targets ('--to') services. Use one service per call or omit the '--kind' option.`;
-}
-await bind({
-  ...options,
-  outputFile: options['output-file']
-});
+const { on, 'to-app-services': app, to } = options
+if (on === 'k8s' && app) {
+  throw 'Option --to-app-services is only supported for Cloud Foundry.';
+}
+if (!to && !app) {
+  throw `Use option --to or -2 to specify the target instance, e.g. cds bind --to myInstance:myService`;
+}
+const outputFile = options['output-file']
+if (app) {
+  const services = await require('../lib/util/cf').getServices(app)
+  // Later: optimize logging for parallelization, services.forEach(service => bind({ ...options, to: service.name, outputFile }))
+  for (const service of services.filter(s => Object.keys(s.credentials).length > 0)) {
+    await bind({ ...options, to: service.name, outputFile })
+  }
+} else {
+  if (args.length > 1) {
+    throw `Too many arguments: Please specify only one or no service.`;
+  }
+  options.serviceArg = args[0]
+  options.targets = options.to.split(/,/g);
+  if (options.targets.length >= 2 && options.serviceArg) {
+    throw `Service argument cannot be specified together with multiple targets ('--to') services. Use one service per call or omit the service argument.`;
+  }
+  if (options.targets.length >= 2 && options.kind) {
+    throw `The option '--kind' cannot be specified together with multiple targets ('--to') services. Use one service per call or omit the '--kind' option.`;
+  }
+  await bind({ ...options, outputFile });
+}
 }
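The new argument validation in `cds bind` can be illustrated in isolation. The following is a simplified sketch (`validateBindArgs` is a hypothetical name for this summary, not part of cds-dk), reproducing the rules from the diff above:

```javascript
// Hypothetical standalone sketch of the cds bind 7.6.0 argument validation rules.
function validateBindArgs(options, args) {
  const { on, 'to-app-services': app, to } = options
  // --to-app-services is Cloud Foundry only
  if (on === 'k8s' && app) throw new Error('Option --to-app-services is only supported for Cloud Foundry.')
  // either --to or --to-app-services must be given
  if (!to && !app) throw new Error('Use option --to or -2 to specify the target instance')
  if (app) return { mode: 'app', app }
  if (args.length > 1) throw new Error('Too many arguments: Please specify only one or no service.')
  const targets = to.split(',')
  // a service argument or --kind cannot be combined with multiple --to targets
  if (targets.length >= 2 && args[0]) throw new Error('Service argument cannot be combined with multiple targets')
  if (targets.length >= 2 && options.kind) throw new Error("'--kind' cannot be combined with multiple targets")
  return { mode: 'instances', targets, serviceArg: args[0] }
}

console.log(validateBindArgs({ to: 'bookshop-db,bookshop-auth' }, []).targets)
```

So `cds bind --to bookshop-db,bookshop-auth` resolves to two targets, while combining multiple targets with `--kind` or a service argument raises an error.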

@@ -101,0 +128,0 @@

@@ -83,3 +83,3 @@ module.exports = Object.assign(build, {

Contents is packaged based on the rules of the 'npm pack' command.
# EXAMPLES

@@ -86,0 +86,0 @@

@@ -48,3 +48,2 @@ #!/usr/bin/env node

 args[0].push(...appendArgs)
-if (args[1]?.['resolve-bindings']) await _resolveBindings({ silent: cmd === 'env' })
 cds.cli = {

@@ -55,2 +54,3 @@ command: cmd,

 }
+if (args[1]?.['resolve-bindings']) await _resolveBindings({ silent: cmd === 'env' })
 return await task.apply(this, args)

@@ -57,0 +57,0 @@ }

@@ -66,8 +66,3 @@ module.exports = Object.assign (env, {

 let env = cds.env
-if (options["resolve-bindings"]) {
-  const BindingManager = require('../lib/bind/bindingManager')
-  const bindingManager = new BindingManager({ env, silent: true })
-  const bindingEnv = await bindingManager.bindingEnv()
-  Object.assign(process.env, bindingEnv)
-}
 // REVISIT: Currently only read in env.test.js
 env['_home_cds-dk'] = path.resolve(__dirname, '..') // to help tools find the DK

@@ -74,0 +69,0 @@

@@ -31,4 +31,8 @@ const { URLS } = require('../lib/init/constants')

-*sqlite* - add support for SQLite databases
+*sqlite* - add support for SQLite
+*postgres* - add support for PostgreSQL
+*liquibase* - add support for Liquibase
 *xsuaa* - add support for authentication via XSUAA

@@ -48,6 +52,6 @@

*file-based-messaging* - add support for file-based messaging
*redis-messaging* - add support for Redis messaging
*kibana* - add support for Kibana formatting
*mta* - add support for MTA-based deployment

@@ -54,0 +58,0 @@

@@ -47,4 +47,4 @@ module.exports = Object.assign(logout, {

 Enter special mode to clear all tokens and settings for any project
-folders which no longer exist in the file system. Ignores any other
-command-line arguments.
+folders which no longer exist in the file system. Additionally, removes
+expired tokens. Ignores any other command-line arguments.

@@ -51,0 +51,0 @@ `;

@@ -41,3 +41,3 @@ module.exports = Object.assign(migrate, {

-*--skip-verification
+*--skip-verification*

@@ -44,0 +44,0 @@ Skip verification step. Does not check for potential deployment

@@ -109,2 +109,3 @@ /* eslint-disable no-console */

 console.log ('|', v('@sap/cds-mtxs'), '|')
+console.log ('|', v('@cap-js/cds-types'), '|')
 function v (component) {

@@ -111,0 +112,0 @@ const version = versions [component] || MISSING

@@ -67,3 +67,10 @@ const watchOnlyOptions = ['--ext', '--livereload', '--open']

 const extDefaults = 'cds,csn,csv,ts,mjs,cjs,js,json,properties,edmx,xml,env'
-const ignore = RegExp(`(_out\\${sep}|node_modules\\${sep}|@cds-models\\${sep}|@types\\${sep}|app(\\${sep}.+)?\\${sep}((webapp|dist|target)\\${sep}|\\.cds-services\\.json$|tsconfig\\.json$|.*\\.tsbuildinfo$))`)
+const ignore = RegExp(
+  `(node_modules|_out|@types|@cds-models)\\${sep}` +
+  `|app(\\${sep}.+)?\\${sep}(` +
+  `(webapp|dist|target)\\${sep}|` +
+  `tsconfig\\.json$|` +
+  `.*\\.tsbuildinfo$` +
+  ')'
+)
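The restructured ignore pattern for `cds watch` can be exercised on its own. A sketch assuming POSIX path separators (the real code uses `path.sep`):

```javascript
const sep = '/' // assumption: POSIX separators; cds watch uses require('path').sep
const ignore = RegExp(
  `(node_modules|_out|@types|@cds-models)\\${sep}` +
  `|app(\\${sep}.+)?\\${sep}(` +
  `(webapp|dist|target)\\${sep}|` +
  `tsconfig\\.json$|` +
  `.*\\.tsbuildinfo$` +
  ')'
)
// generated output, dependency, and type folders are ignored; model sources are not
console.log(ignore.test('node_modules/express/index.js')) // true
console.log(ignore.test('app/ui/webapp/Component.js'))    // true
console.log(ignore.test('srv/cat-service.cds'))           // false
```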

@@ -70,0 +77,0 @@ async function watch ([cwd], {

@@ -17,10 +17,9 @@

 'enterprise-messaging-http': 'messaging',
-'enterprise-messaging-shared': 'messaging'
+'enterprise-messaging-shared': 'messaging',
+'multitenancy': 'multitenancy'
 }
-const PreferredKinds = [
-  'hana-cloud',
-  'xsuaa', // REVISIT: remove with ^8
-  'xsuaa-auth'
-];
+const PreferredKinds = {
+  multitenancy:1, 'hana-cloud':1, xsuaa:1, 'xsuaa-auth':1, 'ias-auth':1, 'saas-registry':1
+};
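Switching `PreferredKinds` from an array to an object turns the candidate lookup from an `indexOf` scan into a key-presence check with `in`. A minimal sketch of the new lookup:

```javascript
const PreferredKinds = {
  multitenancy: 1, 'hana-cloud': 1, xsuaa: 1, 'xsuaa-auth': 1, 'ias-auth': 1, 'saas-registry': 1
}
// pick the first candidate that is a preferred kind
const kindCandidates = ['postgres', 'hana-cloud']
const preferredKind = kindCandidates.find(kind => kind in PreferredKinds)
console.log(preferredKind) // 'hana-cloud'
```

Note that `in` also sees inherited keys (e.g. `'constructor' in PreferredKinds` is true); that is harmless here because the candidates are known service kinds.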

@@ -68,25 +67,20 @@ const PLATFORM_TYPES = {

function findKindFor(kind, service) {
const requires = env.requires.kinds || {};
for (const [kind, service] of Object.entries(requires)) {
const vcapService = env._find_credentials_for_required_service(kind, service, vcapServices);
if (!vcapService) return;
if (!Object.values(requires).find(require => require !== service && require.kind == kind))
vcapService.service.kindCandidates.push(kind);
if (!vcapService) continue;
const isUniqueKind = !Object.values(requires).some(otherService =>
otherService !== service && otherService.kind === kind
)
if (isUniqueKind) vcapService.service.kindCandidates.push(kind);
}
const requires = env.requires.kinds || {};
for (const kind of Object.keys(requires)) {
const requireService = requires[kind];
findKindFor(kind, requireService);
findKindFor(kind, { kind });
}
// Check if one "kind" per service was found
for (const service of services) {
const preferredKind = service.kindCandidates.find(kind => PreferredKinds.indexOf(kind) >= 0);
const preferredKind = service.kindCandidates.find(kind => kind in PreferredKinds);
if (preferredKind) {
service.kind = preferredKind;
} if (service.kindCandidates.length === 1) {
} else if (service.kindCandidates.length === 1) {
service.kind = service.kindCandidates[0];
}
delete service.kindCandidates;

@@ -93,0 +87,0 @@ }

@@ -11,3 +11,3 @@ const cds = require('../cds');

-class CFKeyProvider {
+module.exports = new class CFKeyProvider {
async getCfSpaceInfo() {

@@ -97,6 +97,7 @@ DEBUG?.('getting space info');

 async _resolveManagedService(instanceObj, instance, key, org, space) {
-let keyObj
-try {
-  keyObj = await this.getKey({ names: key, service_instance_guids: instanceObj.guid });
-} catch (error) {
+const create = (cds.cli.command === 'bind' || cds.cli.command === 'deploy') && !cds.cli.options['no-create-service-key']
+const silent = (cds.cli.options.json || cds.cli.command === 'env') && !DEBUG
+const credentials = await cfUtil.getOrCreateServiceKey(instanceObj, key, {}, { create, silent });
+if (!credentials) {
 let message = `No service key ${highlight(key)} found for service instance ${highlight(instance)}.\n\n`;

@@ -106,5 +107,5 @@ message += `Use ${bold(`cf create-service-key ${instanceObj.name} ${key} [-c ...]`)} to create the required service key.`;

}
const planObj = await this.getPlans({ guids: instanceObj.relationships.service_plan.data.guid });
const offeringObj = await this.getOfferings({ guids: planObj.relationships.service_offering.data.guid });
const keyCredentialsObj = await BatchRequest.req(this.target, `/v3/service_credential_bindings/${encodeURIComponent(keyObj.guid)}/details`);
const resolvedBinding = {

@@ -125,3 +126,3 @@ binding: {

 },
-credentials: keyCredentialsObj.credentials
+credentials
 };

@@ -244,3 +245,2 @@

-DEBUG?.(`_req: ${url.toString()}`);
 const result = await axios.get(url.toString(), {

@@ -254,3 +254,1 @@ headers: {

 }
-module.exports = new CFKeyProvider()

@@ -36,3 +36,3 @@ const cds = require('../cds');

 const onText = { 'cf': 'Cloud Foundry', 'k8s': 'Kubernetes' }[on];
-logger.log(`Retrieving data from ${onText}...`);
+if (!options['to-app-services']) logger.log(`Retrieving data from ${onText}...`);
 let resolvedServices

@@ -98,3 +98,5 @@

 const isJavaProject = await checkIsJavaProject(cds.root);
-logger.log(`${info('TIP:')} Run with cloud bindings: ${bold(runCmd(options.for, isJavaProject))}`);
+if (!cds.cli.options['to-app-services']) {
+  logger.log(`${info('TIP:')} Run with cloud bindings: ${bold(runCmd(options.for, isJavaProject))}`);
+}
 }

@@ -101,0 +103,0 @@

@@ -5,4 +5,4 @@ const fs = require('fs')

 const BuildTaskProviderFactory = require('./buildTaskProviderFactory')
-const { hasJavaNature, getProperty, flatten, getDefaultModelOptions } = require('./util')
-const { FILE_EXT_CDS, BUILD_TASK_JAVA_CF, BUILD_TASK_JAVA } = require("./constants")
+const { hasJavaNature, getProperty, flatten, getDefaultModelOptions, hasOptionValue } = require('./util')
+const { FILE_EXT_CDS, BUILD_TASK_JAVA_CF, BUILD_TASK_JAVA, CONTENT_WS } = require("./constants")
 const term = require('../util/term')

@@ -59,3 +59,5 @@ const DEBUG = cds.debug('cli|build')

 tasks = await this.providerFactory.lookupTasks()
+this._applyCliOptions(tasks)
 } else {
+this._applyCliOptions(tasks)
 // 1. apply default values including task.for and task.use and ensure that for all tasks a provider exists - throwing error otherwise

@@ -67,16 +69,11 @@ await this.providerFactory.applyTaskDefaults(tasks)

-// 3. filters the list of build tasks and adapts according to given CLI options
+// 2. filters the list of build tasks
 // Note: A new task might get created, e.g. 'cds build --for hana' will enforce a hana build even if sqlite has been configured
 let existingTasks = tasks
-tasks = this._applyCliOptionsFilter(tasks)
+tasks = await this._filterTasksForCli(tasks)
 if (tasks.length === 0) {
   return tasks
 }
-// a new array is returned to indicate new build tasks have been created
-if (tasks !== existingTasks) {
-  // a different task shall be executed
-  await this.providerFactory.applyTaskDefaults(tasks)
-}
-// 2. add dependencies
+// 3. add dependencies
 existingTasks = [...tasks]

@@ -91,3 +88,3 @@ await this.providerFactory.lookupTasks(tasks, true)

 // obligatory task defaults shared by all tasks
-BuildTaskFactory._applyCommonTaskDefaults(tasks)
+await BuildTaskFactory._applyCommonTaskDefaults(tasks)

@@ -102,7 +99,22 @@ this._setDefaultBuildTargetFolder(tasks)

-static _applyCommonTaskDefaults(tasks) {
-  const modelPaths = getDefaultModelOptions()
+static async _applyCommonTaskDefaults(tasks) {
+  // normalize model options
   tasks.forEach(task => {
-    this._setTaskModelOptions(task, modelPaths)
+    if (task.options?.model && !Array.isArray(task.options.model)) {
+      task.options.model = [task.options.model]
+    }
   })
+  // there must be at least one task without model options
+  if (!tasks.some(task => !task.options?.model?.length > 0)) {
+    return
+  }
+  const modelPaths = await getDefaultModelOptions()
+  let wsModelPaths
+  // calculate only once
+  if (tasks.some(task => hasOptionValue(task.options?.[CONTENT_WS], true))) {
+    wsModelPaths = await getDefaultModelOptions(true)
+  }
+  // set default model options
+  tasks.forEach(task => {
+    this._setTaskModelOptions(task, hasOptionValue(task.options?.[CONTENT_WS], true) ? wsModelPaths : modelPaths)
 if (!task.src) {
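The normalization step at the top of `_applyCommonTaskDefaults` boils down to coercing a scalar `model` option into an array. Sketched standalone (the sample task data is made up):

```javascript
// made-up sample tasks; `model` may be a string or an array in user config
const tasks = [
  { options: { model: 'db' } },
  { options: { model: ['db', 'srv'] } },
  { options: {} }
]
tasks.forEach(task => {
  // coerce a scalar model option into a one-element array
  if (task.options?.model && !Array.isArray(task.options.model)) {
    task.options.model = [task.options.model]
  }
})
console.log(tasks[0].options.model) // [ 'db' ]
```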

@@ -142,3 +154,3 @@ throw new Error(`Invalid build task definition - value of property 'src' is missing in [${task.for || task.use}].`)

-_applyCliOptionsFilter(tasks) {
+async _filterTasksForCli(tasks) {
 const options = this.options

@@ -158,2 +170,4 @@ // filter tasks using either option for, use, src

 resultTasks.push(task)
+this._applyCliOptions(resultTasks)
+await this.providerFactory.applyTaskDefaults(resultTasks)
 }

@@ -166,3 +180,2 @@ } else if (resultTasks.length <= tasks.length) {

 }
-this._applyCliOptions(resultTasks)
 return resultTasks

@@ -187,3 +200,3 @@ }

 task.options = task.options || {}
-if (!task.options.model || Array.isArray(task.options.model) && task.options.model.length === 0) {
+if (!task.options.model?.length > 0) {
 defaultModelPaths = new Set(defaultModelPaths)

@@ -194,4 +207,2 @@ if (task.src) {

 task.options.model = [...defaultModelPaths]
-} else if (!Array.isArray(task.options.model)) {
-  task.options.model = [task.options.model]
 }

@@ -198,0 +209,0 @@ }

@@ -29,9 +29,10 @@ exports.OUTPUT_MODE = "outputMode"

 exports.CONTENT_PACKAGE_JSON = "contentPackageJson" // create package.json file if not existing, or modify existing package.json - ENABLED by default
-exports.CONTENT_PACKAGELOCK_JSON = "contentPackageLockJson" // copy package-lock.json file if existing into deployment folder - ENABLED by default
-exports.CONTENT_HDBTABLEDATA = "contentHdbtabledata" // create .hdbtabledata files for .csv files if not existing - ENABLED by default
-exports.CONTENT_NPMRC = "contentNpmrc" // copy .npmrc file if existing into deployment folder - ENABLED by default
-exports.CONTENT_CDSRC_JSON = "contentCdsrcJson" // copy .cdsrc.json file if existing into deployment folder - ENABLED by default
-exports.CONTENT_NODE_MODULES = "contentNodeModules" // copy node_modules folder if existing into deployment folder - DISABLED by default
-exports.CONTENT_ENV = "contentEnv" // copy .env file if existing into deployment folder - DISABLED by default
-exports.CONTENT_DEFAULT_ENV_JSON = "contentDefaultEnvJson" // copy default-env.json file if existing into deployment folder - DISABLED by default
+exports.CONTENT_PACKAGELOCK_JSON = "contentPackageLockJson" // copy package-lock.json file into deployment folder - ENABLED by default
+exports.CONTENT_HDBTABLEDATA = "contentHdbtabledata" // create .hdbtabledata files for .csv files if not already existing - ENABLED by default
+exports.CONTENT_NPMRC = "contentNpmrc" // copy .npmrc file into deployment folder - ENABLED by default
+exports.CONTENT_CDSRC_JSON = "contentCdsrcJson" // copy .cdsrc.json file into deployment folder - ENABLED by default
+exports.CONTENT_NODE_MODULES = "contentNodeModules" // copy node_modules folder into deployment folder - DISABLED by default
+exports.CONTENT_ENV = "contentEnv" // copy .env file into deployment folder - DISABLED by default
+exports.CONTENT_DEFAULT_ENV_JSON = "contentDefaultEnvJson" // copy default-env.json file into deployment folder - DISABLED by default
+exports.CONTENT_WS = "ws" // determine model paths including submodules + CONTENT_SUBMODULES_HANA - DISABLED by default

@@ -47,3 +48,3 @@ exports.CSV_FILE_TARGET = "csvFileTarget" // target folder when copying CSV files to the deployment target folder ./db/src/gen/*, default is 'data'

 exports.FILE_EXT_CDS = ".cds"
 exports.MTX_SIDECAR_FOLDER = "mtx/sidecar" // default name of the mtx sidecar folder
 exports.DEFAULT_CSN_FILE_NAME = "csn.json"

@@ -50,0 +51,0 @@ // REVISIT: the models are not required if a custom server.js file is used for MTX bootstrap

@@ -1,2 +0,1 @@

-const fs = require('fs')
 const path = require('path')

@@ -68,3 +67,3 @@ const cds = require('../../cds')

 async _resolveSourcePaths(csn) {
-const regex = new RegExp(path.resolve(cds.root, this.ftsName).replace(/\\/g, '\\\\') + '[/|\\\\](?<ftName>[^/|\\\\]*)')
+const regex = new RegExp(this.ftsName + '[/|\\\\](?<ftName>[^/|\\\\]*)')
 let paths = { base: [] }

@@ -81,7 +80,2 @@

-// keep existing behavior and return paths returned by cds.resolve if features are not supported by this project
-if (!fs.existsSync(path.join(cds.root, this.ftsName))) {
-  return paths
-}
 // add source file paths for the features

@@ -88,0 +82,0 @@ paths.features = csn['$sources'].reduce((acc, file) => {

@@ -6,5 +6,6 @@ const fs = require('fs')

 const InternalBuildPlugin = require('../internalBuildPlugin')
-const { BuildError, relativePaths, BuildMessage } = require('../../util')
+const { BuildError, relativePaths, BuildMessage, getWorkspaces, hasOptionValue } = require('../../util')
 const { OUTPUT_MODE_NONE, OUTPUT_MODE, CONTENT_PACKAGE_JSON, CONTENT_HDBTABLEDATA, CSV_FILE_DETECTION,
-  CONTENT_ENV, CONTENT_DEFAULT_ENV_JSON, CONTENT_NODE_MODULES, OUTPUT_MODE_RESULT, CONTINUE_UNRESOLVED_SCHEMA_CHANGES, CSV_FILE_TARGET } = require('../../constants')
+  CONTENT_ENV, CONTENT_DEFAULT_ENV_JSON, CONTENT_NODE_MODULES, OUTPUT_MODE_RESULT, CONTINUE_UNRESOLVED_SCHEMA_CHANGES,
+  CSV_FILE_TARGET, CONTENT_WS } = require('../../constants')
 const { WARNING } = InternalBuildPlugin

@@ -41,3 +42,3 @@ const DEFAULT_COMPILE_DEST_FOLDER = path.normalize("src/gen")

 }
-// the order is important
+// the order of 1 and 2 is important
 // 1. compile

@@ -53,4 +54,8 @@ const hdiPlugins = await this._compileToHana(model)

 }
-// 3. create additional stuff in dest
+// copy native hana artifacts from workspace dependencies
+if (hasOptionValue(this.task.options?.[CONTENT_WS], true)) {
+  await this._addWorkspaceContent()
+}
+// create additional stuff in dest
 await this._writeHdiConfig(hdiPlugins)

@@ -119,3 +124,2 @@ await this._writeHdiNamespace()

-// 2. staging build: copy files except *.cds, .env, default-env.json, ./node_modules/**
 if (this.isStagingBuild()) {

@@ -127,5 +131,5 @@ let blockList = "\\.cds$|\\.csv$|\\.hdbtabledata$"

+// 2. staging build: copy files except *.cds, .env, default-env.json, ./node_modules/**
 await this.copyNativeContent(src, dest, (entry) => {
if (entry.startsWith(dbSrc)) {
// entire native content
return true

@@ -386,7 +390,7 @@ }

 async _writePackageJson() {
-const packageJson = path.join(this.task.src, "package.json")
-const exists = fs.existsSync(packageJson)
+const pkgJson = path.join(this.task.src, "package.json")
+const exists = fs.existsSync(pkgJson)
 if (exists) {
-  DEBUG?.(`skip create [${relativePaths(cds.root, packageJson)}], already existing`)
+  DEBUG?.(`skip create [${relativePaths(cds.root, pkgJson)}], already existing`)
 }

@@ -541,2 +545,33 @@ if (this.isStagingBuild() && !exists) {

+/**
+ * Copy native hana content from workspaces.
+ */
+async _addWorkspaceContent() {
+  const srcPaths = new Set()
+  const workspaces = await getWorkspaces()
+  workspaces.forEach(workspace => {
+    const dbSrc = path.join(cds.root, workspace, 'db/src')
+    const dbCfg = path.join(cds.root, workspace, 'db/cfg')
+    if (fs.existsSync(dbSrc)) {
+      srcPaths.add(dbSrc)
+    }
+    if (fs.existsSync(dbCfg)) {
+      srcPaths.add(dbCfg)
+    }
+  })
+  DEBUG?.(`Copying hana native artifacts from folders '${[...srcPaths].map(f => path.relative(cds.root, f)).join(', ')}'`)
+  const dest = path.join(this.task.dest, 'src/gen')
+  for (const src of srcPaths) {
+    await this.copyNativeContent(src, dest, (entry) => {
+      if (entry.match(/(\/|\\)gen(\/|\\)?$/)) {
+        // skip gen folder content
+        return false
+      }
+      return true
+    })
+  }
+}
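The filter callback passed to `copyNativeContent` above skips anything ending in a `gen` folder. A standalone check of that pattern:

```javascript
// same regex as in the filter above; true means "copy", false means "skip"
const shouldCopy = entry => !/(\/|\\)gen(\/|\\)?$/.test(entry)
console.log(shouldCopy('db/src/gen/'))    // false - generated content is skipped
console.log(shouldCopy('db/src/tables'))  // true
```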
static async _readTemplateAsJson(template) {

@@ -543,0 +578,0 @@ const content = await fs.promises.readFile(path.join(__dirname, 'template', template), 'utf-8')

@@ -8,3 +8,3 @@ const fs = require('fs')

 BUILD_TASK_PREFIX, BUILD_TASKS, BUILD_TASK_MTX_SIDECAR, MTX_SIDECAR_FOLDER, BUILD_TASK_MTX_EXTENSION, NODEJS_MODEL_EXCLUDE_LIST,
-  IGNORE_DEFAULT_MODELS } = require("../constants")
+  IGNORE_DEFAULT_MODELS, CONTENT_WS } = require("../constants")
const DEBUG = cds.debug('cli|build')

@@ -40,3 +40,3 @@

-InternalBuildTaskProvider._setDefaultModel(task)
+await InternalBuildTaskProvider._setDefaultModel(task)
 }

@@ -295,3 +295,4 @@

-static _setDefaultModel(task) {
+static async _setDefaultModel(task) {
+const ws = hasOptionValue(task.options?.[CONTENT_WS], true)
 let taskModelPaths = task.options?.model

@@ -307,6 +308,7 @@ if (taskModelPaths && !Array.isArray(taskModelPaths)) {

 if (!hasOptionValue(task.options?.[IGNORE_DEFAULT_MODELS], true)) {
-  defaultModelPaths = getDefaultModelOptions().filter(p => p.match(allowList))
+  defaultModelPaths = await getDefaultModelOptions(ws)
+  defaultModelPaths = defaultModelPaths.filter(p => p.match(allowList))
 }
 } else {
-  defaultModelPaths = getDefaultModelOptions()
+  defaultModelPaths = await getDefaultModelOptions(ws)
 defaultModelPaths.push(task.src)

@@ -313,0 +315,0 @@ if (hasOptionValue(task.options?.[IGNORE_DEFAULT_MODELS], true)) {

@@ -93,9 +93,3 @@ const fs = require('fs')

 }
-_isCompilerV1() {
-  const version = cds.compiler.version()
-  const match = version.match(/(\d+)\.?(\d*)\.?(\d*)/)
-  return match && match[1] === 1
-}
 }
 module.exports = JavaBuildPlugin

@@ -53,2 +53,7 @@ const path = require('path')

 await this.copySrvContent(this.task.src, destSidecar, destSidecar)
+// copy .npmrc from project root, if it exists and no .npmrc exists in the sidecar
+if (!fs.existsSync(path.join(destSidecar, '.npmrc')) && fs.existsSync(path.join(cds.root, '.npmrc'))) {
+  await this.copy(path.join(cds.root, '.npmrc')).to(path.join(destSidecar, '.npmrc'))
+}
 }

@@ -55,0 +60,0 @@

@@ -6,3 +6,3 @@ const fs = require('fs')

 const EdmxBuildPlugin = require('../edmxBuildPlugin')
-const { BuildError } = require('../../util')
+const { BuildError, getWorkspaces } = require('../../util')
const { OUTPUT_MODE, OUTPUT_MODE_FILESYSTEM, ODATA_VERSION_V2, FOLDER_GEN, CONTENT_EDMX, CONTENT_PACKAGELOCK_JSON,

@@ -40,23 +40,22 @@ CONTENT_NPMRC, CONTENT_CDSRC_JSON, CONTENT_ENV, CONTENT_DEFAULT_ENV_JSON, FLAVOR_LOCALIZED_EDMX } = require('../../constants')

 const model = await this.model()
-if (!model) {
+if (!model && !this.context.options['ws']) {
   return
 }
+if (model) {
   const { dictionary, sources } = await this.compileAll(model, destSrv, destRoot)
   // collect and write language bundles into single i18n.json file
   await this.collectAllLanguageBundles(dictionary, sources, destSrv, destRoot)
   if (!this.hasBuildOption(CONTENT_EDMX, false)) {
     const compileOptions = { [FLAVOR_LOCALIZED_EDMX]: this.hasBuildOption(FLAVOR_LOCALIZED_EDMX, true) }
     // inferred flavor is required by edmx compiler backend
     // using cds.compile instead of cds.compiler.compileSources ensures that cds.env options are correctly read
     const baseModel = dictionary.base.meta.flavor !== 'inferred' ? await cds.compile(sources.base, super.options(), 'inferred') : dictionary.base
     await this.compileToEdmx(baseModel, path.join(this.destSrv, 'odata', cds.env.odata.version), compileOptions)
   }
+}
 if (this.isStagingBuild() && this.hasBuildOption(OUTPUT_MODE, OUTPUT_MODE_FILESYSTEM)) {
   const srcSrv = this.task.src === cds.root ? path.resolve(this.task.src, cds.env.folders.srv) : this.task.src
   await this._copyNativeContent(cds.root, srcSrv, destRoot, destSrv)
-  if (this.context.options.ws) {
+  if (this.context.options['ws']) {
     await this._addWorkspaceDependencies(destRoot)

@@ -165,28 +164,14 @@ }

 async _addWorkspaceDependencies(dest) {
-const wsRoot = this._findWorkspaceRoot(cds.root)
+const wsRoot = NodejsBuildPlugin._findWorkspaceRoot(cds.root)
 const appPkg = require(path.join(dest, 'package.json'))
 let workspaces
-if (this.task.options?.workspaces) {
+if (this.task.options.workspaces) {
   workspaces = Array.isArray(this.task.options.workspaces) ? this.task.options.workspaces : [this.task.options.workspaces]
+} else {
+  workspaces = await getWorkspaces(true, false, wsRoot);
 }
-if (!workspaces) {
-  workspaces = []
-  for (const name in appPkg.dependencies) {
-    if (appPkg.dependencies[name] === '*') {
-      try {
-        const pkg = require.resolve(path.join(name, 'package.json'), { paths: [wsRoot] })
-        workspaces.push(path.dirname(path.relative(wsRoot, pkg)))
-      } catch(e) {
-        if (e.code === 'MODULE_NOT_FOUND') {
-          throw new BuildError(`Workspace packaging failed - module '${name}' not found. Make sure that configured npm workspaces are installed.`)
-        }
-        throw e
-      }
-    }
-  }
-}
 if (workspaces.length) {
   let pkgDescriptors
   try {
-    pkgDescriptors = await this._execNpmPack(dest, wsRoot, workspaces)
+    pkgDescriptors = await NodejsBuildPlugin._execNpmPack(dest, wsRoot, workspaces)
 } catch (e) {

@@ -199,5 +184,5 @@ DEBUG?.(e)

 for (const pkgDescriptor of pkgDescriptors) {
-const {name, filename: fileName} = pkgDescriptor
-const filePath = path.join(dest, fileName)
-if (appPkg.dependencies?.[name] === '*') {
+const { name, filename: fileName } = pkgDescriptor
+const filePath = path.join(dest, fileName)
+if (appPkg.dependencies?.[name]) {
 appPkg.dependencies[name] = `file:${fileName}`

@@ -217,11 +202,11 @@ this.pushFile(filePath)

-_findWorkspaceRoot(dir) {
+static _findWorkspaceRoot(dir) {
 try {
-  if(require(path.join(dir, 'package.json')).workspaces) {
-    return dir
+  if (require(path.join(dir, 'package.json')).workspaces) {
+    return dir
   }
-} catch(err) {
+} catch (err) {
   // ignore
 }
-if(fs.existsSync(path.join(dir, '.gitmodules')) || fs.existsSync(path.join(dir, 'node_modules'))) {
+if (fs.existsSync(path.join(dir, '.gitmodules')) || fs.existsSync(path.join(dir, 'node_modules'))) {
 return // project root reached

@@ -233,9 +218,9 @@ }

 }
-async _execNpmPack(dest, wsRoot, workspaces) {
+static async _execNpmPack(dest, wsRoot, workspaces) {
 const args = ['pack', '-ws', '--json', `--pack-destination=${dest}`]
 workspaces.forEach(w => args.push(`-w=${w}`))
-DEBUG?.(`execute command: npm ${args}`)
-const result = await cmd.spawnCommand( 'npm', args, { cwd: wsRoot }, true, !!DEBUG)
+DEBUG?.(`execute command: npm ${args.join(', ')}`)
+const result = await cmd.spawnCommand('npm', args, { cwd: wsRoot }, true, !!DEBUG)
DEBUG?.(result)

@@ -242,0 +227,0 @@ return JSON.parse(result)
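`_execNpmPack` returns the parsed JSON output of `npm pack --json`, and the dependency loop above rewrites matching entries to `file:` references. A sketch with an abbreviated, made-up descriptor:

```javascript
// abbreviated, made-up sample of npm pack --json output
const result = '[{"name":"@demo/ui","filename":"demo-ui-1.0.0.tgz"}]'
const pkgDescriptors = JSON.parse(result)
const appPkg = { dependencies: { '@demo/ui': '*', express: '^4' } }
for (const { name, filename: fileName } of pkgDescriptors) {
  // point workspace dependencies at the packed tarball
  if (appPkg.dependencies?.[name]) appPkg.dependencies[name] = `file:${fileName}`
}
console.log(appPkg.dependencies['@demo/ui']) // file:demo-ui-1.0.0.tgz
```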

 const fs = require('fs')
 const path = require('path')
 const cds = require('../cds')
-const { SEVERITY_ERROR, FILE_EXT_CDS } = require('./constants')
+const cp = require('child_process');
+const execAsync = require('util').promisify(cp.exec)
+const IS_WIN = require('os').platform() === 'win32'
+const { SEVERITY_ERROR, FILE_EXT_CDS } = require('./constants');
 const DEBUG = cds.debug('cli|build')

@@ -116,6 +120,22 @@ function getProperty(src, segments) {

-function getDefaultModelOptions() {
-  const fts = cds.env.features.folders
-  const modelPaths = cds.resolve(!fts ? '*' : ['*', fts], false)
-  return _pushModelPaths(modelPaths)
-}
+async function getDefaultModelOptions(ws = false) {
+  const wildcards = cds.env.requires.toggles && cds.env.features.folders ? ['*', cds.env.features.folders] : '*'
+  let paths = _pushModelPaths(cds.resolve(wildcards, false))
+  if (ws) {
+    const workspaces = await getWorkspaces()
+    const env = cds.env
+    const root = cds.root
+    try {
+      workspaces.forEach(workspace => {
+        cds.root = path.join(root, workspace)
+        cds.env = cds.env.for('cds', cds.root)
+        paths = paths.concat(_pushModelPaths(cds.resolve(cds.env.requires.toggles ? wildcards : '*', false)).map(p => path.join(workspace, p)))
+      })
+    } finally {
+      cds.root = root
+      cds.env = env
+    }
+  }
+  return paths
+}

@@ -131,26 +151,2 @@

function _pushModelPaths(...modelPaths) {
const model = new Set()
// may contain nested arrays
modelPaths = flatten(modelPaths)
const { roots } = cds.env
modelPaths.forEach(m => {
if (m && !model.has(m) && !model.has(m + "/")) {
// filter root model paths that do not exist
// other entries are added as is, e.g. reuse model entries
if (roots.includes(m)) {
const dir = path.resolve(cds.root, m)
if (fs.existsSync(dir)) {
model.add(m)
} else if (fs.existsSync(dir + FILE_EXT_CDS)) { //might be cds file name, compatibility to old build configs
model.add(m)
}
} else {
model.add(m)
}
}
})
return [...model].map(m => normalizePath(m))
}
function flatten(modelPaths) {

@@ -185,4 +181,112 @@ return modelPaths.reduce((acc, m) => {

async function pathExists(path) {
return fs.promises.access(path).then(() => true).catch(() => false);
}
/**
 * Returns the list of workspaces defined for the current project. A workspace dependency
 * needs to be defined, resulting in a corresponding sym-link.
 * @returns workspace paths, relative to 'cds.root'.
*/
async function getWorkspaces(resolve = false, devDependencies = true, wsRoot = cds.root) {
const workspaces = []
const pkgJsonWsRoot = require(path.join(wsRoot, 'package.json'))
let pkgJson = pkgJsonWsRoot
if (wsRoot !== cds.root) {
pkgJson = require(path.join(cds.root, 'package.json'))
}
if (pkgJsonWsRoot.workspaces) {
if (resolve) {
_validateWsDependencies(wsRoot, pkgJson.dependencies, devDependencies ? pkgJson.devDependencies : {})
}
// read npm workspaces
const pkgs = await _execNpmWs(wsRoot)
pkgs.forEach(pkg => {
if (!resolve || devDependencies && pkgJson.devDependencies?.[pkg] || pkgJson.dependencies?.[pkg]) {
const workspace = _getWorkspace(wsRoot, pkg)
if (workspace) {
workspaces.push(workspace)
}
}
})
}
DEBUG?.(`Found ${workspaces.length} workspaces ${workspaces.join(', ')}`)
return workspaces
}
/**
* Read installed npm workspaces using 'npm ls -ws --json'.
 * @returns {Array} list of package names, or an empty array if no
 * workspaces are defined or they have not been installed.
*/
async function _execNpmWs(wsRoot) {
try {
const cmdLine = 'npm ls -ws --json'
DEBUG?.(`execute ${cmdLine}`)
let result = await execAsync(cmdLine, { shell: IS_WIN, stdio: ['inherit', 'pipe', 'inherit'], cwd: wsRoot })
// DEBUG?.(result.stdout)
result = JSON.parse(result.stdout)
if (result.dependencies) {
return Object.keys(result.dependencies)
} else if (result.name) {
return [result.name]
}
} catch (e) {
e.stderr = e.stderr?.replace(/npm ERR! /g, '')
throw e
}
return []
}
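The mapping from `npm ls -ws --json` output to workspace names in `_execNpmWs` can be sketched in isolation. The sample output shape below (a root object with a `dependencies` map keyed by workspace package name) is an assumption for illustration, not captured from a real `npm ls` run:

```javascript
// Standalone sketch of the parsing in _execNpmWs above: map the JSON
// printed by `npm ls -ws --json` to a list of workspace package names.
function workspaceNames(stdout) {
  const result = JSON.parse(stdout)
  // multiple workspaces show up as keys of `dependencies`
  if (result.dependencies) return Object.keys(result.dependencies)
  // a single package reports only its own name
  if (result.name) return [result.name]
  return []
}

// hypothetical sample output for illustration
const sample = JSON.stringify({ dependencies: { 'my-plugin': {}, 'my-lib': {} } })
console.log(workspaceNames(sample)) // ['my-plugin', 'my-lib']
```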
function _getWorkspace(wsRoot, pkg) {
try {
const pkgPath = require.resolve(path.join(pkg, 'package.json'), { paths: [wsRoot] })
if (!pkgPath.match(/(\/|\\)node_modules(\/|\\)?/) && pkgPath.startsWith(wsRoot)) {
return path.relative(wsRoot, path.dirname(pkgPath))
}
} catch (e) {
// ignore - workspace dependencies have already been validated
}
}
function _validateWsDependencies(wsRoot, dependencies, devDependencies) {
const allDependencies = {...dependencies, ...devDependencies}
for (const name in allDependencies) {
if (allDependencies[name] === '*') {
try {
require.resolve(path.join(name, 'package.json'), { paths: [wsRoot] })
} catch (e) {
if (e.code === 'MODULE_NOT_FOUND') {
throw new BuildError(`npm packaging failed, module '${name}' not found. Make sure configured npm workspaces are installed.`)
}
throw e
}
}
}
}
function _pushModelPaths(...modelPaths) {
const model = new Set()
// may contain nested arrays
modelPaths = flatten(modelPaths)
const { roots } = cds.env
modelPaths.forEach(m => {
if (m && !model.has(m) && !model.has(m + "/")) {
// filter roots and absolute model paths that do not exist
if (roots.includes(m)) {
const dir = path.resolve(cds.root, m)
if (fs.existsSync(dir)) {
model.add(m)
} else if (fs.existsSync(dir + FILE_EXT_CDS)) { //might be cds file name, compatibility to old build configs
model.add(m)
}
} else {
model.add(m)
}
}
})
return [...model].map(m => normalizePath(m))
}
/**
* Return gnu-style error string for location `loc`:

@@ -214,6 +318,2 @@ * - 'File:Line:Col' without `loc.end`

async function pathExists(path) {
return fs.promises.access(path).then(() => true).catch(() => false);
}
/**

@@ -283,4 +383,5 @@ * Class for individual build message.

pathExists,
getWorkspaces,
BuildMessage,
BuildError
}

@@ -162,3 +162,3 @@ const cp = require('child_process');

const cfServiceInstanceKeyName = `${cfServiceInstanceName}-key`;
let serviceKey = await cfUtil.getOrCreateServiceKey(serviceInstance, cfServiceInstanceKeyName);
let serviceKey = await cfUtil.getOrCreateServiceKey(serviceInstance, cfServiceInstanceKeyName, { permissions: 'development' });
this._validateServiceKey(serviceKey, cfServiceInstanceKeyName);

@@ -165,0 +165,0 @@

@@ -1,16 +0,107 @@

const { merge } = require('../init/merge')
const { readJSON } = require('../util/fs')
module.exports = async function overlay4(schemaName, cwd = process.cwd()) {
function _local(id) {
return require.resolve(id, {
paths: [
process.cwd(), // project-local module is preferred
__dirname // otherwise from our own dependencies
]
})
}
// need to load @sap/cds from current cwd which might
// differ from initial cwd to get project specific cds version
const cds = require(require.resolve('@sap/cds', {
paths: [cwd, __dirname]
}))
function _robustRequire(file) {
const fs = require('fs')
const path = require('path')
// force cds.root to be project root
cds.root = cwd
file = path.normalize(file)
let content
try {
content = fs.readFileSync(file, 'utf8')
} catch (e) {
// file is not readable -> nothing to do
return
}
// load defaults, throw err if schema not found
try {
return JSON.parse(content)
} catch (e) {
// invalid json -> ignore and try regex
}
const result = {}
try {
const deps = JSON.parse(content.match(/"dependencies"\s*:\s*({[^}]*})/)[1])
result.dependencies = deps
} catch (e) { /* ignore */ }
try {
const devDeps = JSON.parse(content.match(/"devDependencies"\s*:\s*({[^}]*})/)[1])
result.devDependencies = devDeps
} catch (e) { /* ignore */ }
return result
}
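The regex fallback in `_robustRequire` can be exercised on its own; the broken package.json sample below is hypothetical. Note that `{[^}]*}` only recovers a flat dependencies object, which is exactly the shape a dependencies map has:

```javascript
// Standalone sketch of the regex fallback above: recover the "dependencies"
// map from a package.json whose content as a whole is not valid JSON.
// Works only for flat objects ({[^}]*} cannot span nested braces).
function extractDeps(content) {
  const result = {}
  try {
    result.dependencies = JSON.parse(content.match(/"dependencies"\s*:\s*({[^}]*})/)[1])
  } catch (e) { /* ignore - no parsable dependencies block */ }
  return result
}

// trailing garbage makes the file as a whole unparsable (hypothetical sample)
const broken = '{ "name": "demo", "dependencies": { "lodash": "^4.17.0" }, oops }'
console.log(extractDeps(broken)) // { dependencies: { lodash: '^4.17.0' } }
```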
_robustRequire.resolve = function (id, options = {}) {
const fs = require('fs')
const path = require('path')
try {
return require.resolve(id, options)
} catch (err) {
// ignore
}
const [, mod, file] = /(.+)\/([^/]+)$/.exec(id)
let modPath = path.join(process.cwd(), 'node_modules', mod, file)
const ext = path.extname(modPath)
if (!ext) {
modPath = modPath + '.js'
}
if (fs.existsSync(modPath)) {
return modPath
}
throw new Error('not found')
}
function mergePluginSchema(pluginSchema) {
return {
into: async (targetSchema) => {
if (pluginSchema.buildTaskType) {
targetSchema.$defs?.buildTaskType?.enum?.push(pluginSchema.buildTaskType.name);
targetSchema.$defs?.buildTaskType?.enumDescriptions?.push(pluginSchema.buildTaskType.description);
targetSchema._cds_schema_overlays.push('buildTaskType')
}
if (pluginSchema.databaseType) {
targetSchema.$defs?.databaseType?.enum?.push(pluginSchema.databaseType.name);
targetSchema.$defs?.databaseType?.enumDescriptions?.push(pluginSchema.databaseType.description);
targetSchema._cds_schema_overlays.push('databaseType')
}
if (pluginSchema.cds) {
if (targetSchema.$defs?.cdsRoot?.properties) {
Object.assign(targetSchema.$defs.cdsRoot.properties, pluginSchema.cds)
targetSchema._cds_schema_overlays.push('cdsRoot')
}
// backwards compatibility
if (targetSchema.properties) {
Object.assign(targetSchema.properties, pluginSchema.cds)
targetSchema._cds_schema_overlays.push('root')
}
}
}
}
}
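A minimal walk-through of the enum merge performed above, reduced to the `buildTaskType` branch. The plugin contribution ("postgres") and the seed schema are illustrative values, not taken from a real plugin:

```javascript
// Minimal sketch of the buildTaskType branch of mergePluginSchema above:
// a plugin contributes one enum value plus its description, and the merge
// is recorded in _cds_schema_overlays.
function mergeBuildTaskType(pluginSchema, targetSchema) {
  if (pluginSchema.buildTaskType) {
    targetSchema.$defs?.buildTaskType?.enum?.push(pluginSchema.buildTaskType.name)
    targetSchema.$defs?.buildTaskType?.enumDescriptions?.push(pluginSchema.buildTaskType.description)
    targetSchema._cds_schema_overlays.push('buildTaskType')
  }
}

// hypothetical target schema and plugin contribution
const target = {
  $defs: { buildTaskType: { enum: ['hana'], enumDescriptions: ['SAP HANA build'] } },
  _cds_schema_overlays: []
}
mergeBuildTaskType({ buildTaskType: { name: 'postgres', description: 'PostgreSQL build' } }, target)
console.log(target.$defs.buildTaskType.enum) // ['hana', 'postgres']
```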
module.exports = async function overlay4(schemaName) {
const path = require('path')
const { merge } = require('../init/merge')
const cds = require('../cds')
cds.root = process.cwd()
const { readJSON } = require('../util/fs')
let result;

@@ -20,17 +111,66 @@ if (typeof cds.schema?.default4 === 'function') {

} else if (cds.env?.schemas?.[schemaName]) {
// backwards compatibility
// backwards compatibility, fails for invalid json files
result = await readJSON(cds.env.schemas[schemaName])
} else {
throw new Error(`cds ${cds.version} in ${cwd} does not support schema retrieval.`)
throw new Error(`cds ${cds.version} in ${cds.root} does not support schema retrieval.`)
}
// optional, don't throw if file not found
if (schemaName !== 'cds-rc.json') {
return result
}
const oldTitle = result.title
const oldDescription = result.description
result._cds_schema_overlays = []
try {
await merge(__dirname, 'schemas', schemaName).into(result)
result.description = result.description + ' - overlaid'
const schemaFile = path.join(__dirname, 'schemas', schemaName.replace(/\.json$/, '.js'))
const schema = require(schemaFile)
await merge(schema).into(result)
result._cds_schema_overlays.push('cdsRoot')
if (result.properties) {
// backwards compatibility
await merge(schema.$defs.cdsRoot.properties).into(result.properties)
result._cds_schema_overlays.push('root')
}
} catch (err) {
if (err.code !== 'ENOENT') throw err
console.error(err.message)
}
try {
const pluginsLib = require(_local('@sap/cds/lib/plugins.js'))
if (pluginsLib) {
const oldRequire = pluginsLib.require
try {
pluginsLib.require = _robustRequire
const plugins = pluginsLib.fetch?.()
if (plugins) {
for (const [key, { impl, packageJson }] of Object.entries(plugins)) {
const pluginPackageJsonFile = packageJson || path.join(path.dirname(impl), 'package.json')
try {
const pluginPackageJson = require(pluginPackageJsonFile)
if (pluginPackageJson.cds?.schema) {
await mergePluginSchema(pluginPackageJson.cds.schema).into(result);
}
} catch (err) {
if (err.code !== 'ENOENT') {
console.error(`Error loading schema from plugin ${key}: ${err.message}`)
}
}
}
}
} finally {
pluginsLib.require = oldRequire
}
}
} catch (err) {
console.error(err.message)
}
result.title = oldTitle
result.description = oldDescription
return result
}

@@ -7,3 +7,3 @@ module.exports = {

MAVEN_ARCHETYPE_VERSION: '2.4.1',
MAVEN_ARCHETYPE_VERSION: '2.5.0',

@@ -10,0 +10,0 @@ OPTIONS: Object.freeze({

@@ -237,3 +237,3 @@ const YAML = require('@sap/cds-foss').yaml

if (!templateCollection) return
if (!templateCollection?.items) return

@@ -385,2 +385,3 @@ templateCollection.items.forEach((templateNode, templateIndex) => {

node = node.get(key)
if(node === undefined) break;
}

@@ -387,0 +388,0 @@ }

@@ -12,5 +12,4 @@ const { readProject } = require('../../projectReader')

async canRun() {
const { hasMta, hasHelm, hasHelmUnifiedRuntime, isJava } = readProject()
const { hasMta, hasHelm, hasHelmUnifiedRuntime } = readProject()
if ((hasHelmUnifiedRuntime || hasHelm) && !hasMta) throw `'cds add application-logging' is not available for Kyma yet`
if (isJava) throw `'cds add application-logging' is not available for Java yet`
return true

@@ -17,0 +16,0 @@ }

{
"name": "approuter",
"dependencies": {
"@sap/approuter": "^14.0.0"
"@sap/approuter": "^16.0.0"
},

@@ -6,0 +6,0 @@ "engines": {

[
"src/gen/**/*.hdbview",
"src/gen/**/*.hdbindex",
"src/gen/**/*.hdbconstraint"
]
"src/gen/**/*.hdbconstraint",
"src/gen/**/*_drafts.hdbtable"
]
const cds = require('../../..')
const { copy, rimraf, exists } = cds.utils
const { copy, rimraf, exists, fs } = cds.utils
const { join } = require('path')

@@ -22,3 +22,4 @@

static hasInProduction() {
return exists(join('chart', 'values.yaml')) && !exists(join('chart', 'charts', 'web-application')) || cds.cli.options.add.has('helm-unified-runtime')
const chart = exists(join('chart/Chart.yaml')) ? fs.readFileSync(join(cds.root, 'chart/Chart.yaml'), 'utf8') : ''
return chart.includes('https://int.repositories.cloud.sap/artifactory/virtual-unified-runtime-helm-dmz') || cds.cli.options.add.has('helm-unified-runtime')
}

@@ -25,0 +26,0 @@

@@ -9,5 +9,7 @@ {{- printf "Thank you for installing %s version %s." .Chart.Name .Chart.Version }}

{{- $deployment := (get $root.Values $name) }}
{{- if $deployment.expose.enabled }}
{{- $ := merge (dict "name" $name "deployment" $deployment) $root }}
{{- $url := include "cap.deploymentHostFull" $ }}
{{ printf "%s - https://%s" $name $url }}
{{- end -}}
{{- end }}

@@ -17,2 +17,3 @@ const { join } = require('path');

// Won't need this any more with plugin approach: package not there -> plugin not pulled
static hasInProduction() {

@@ -48,16 +49,4 @@ const { add } = cds.cli.options

await this._mergeDependency('content-deployment', 'html5-apps-deployer')
if (hasApprouter) {
await this._mergeDependency('service-instance', 'html5-apps-repo-runtime')
await merge({
'html5-apps-repo-runtime': {
serviceOfferingName: 'html5-apps-repo',
servicePlanName: 'app-runtime'
},
approuter: {
bindings: {
'html5-apps-repo-runtime': { serviceInstanceName: 'html5-apps-repo-runtime' }
}
},
}).into('chart/values.yaml')
}

@@ -64,0 +53,0 @@ }

@@ -6,3 +6,3 @@ const { join } = require('path')

const { readProject } = require('../../projectReader')
const { merge, sort } = require('../../merge')
const { merge, sort, removeFromYAML } = require('../../merge')
const { copyRenderedJSON } = require('../../../util/fs')

@@ -106,3 +106,4 @@ const {

await copy(join(__dirname, 'files', `mtxs-configmap-${isJava ? 'java' : 'nodejs'}.yaml`)).to('chart', 'templates', 'mtxs-configmap.yaml') // REVISIT: Move to build task
await removeFromYAML(join('chart', 'values.yaml'), ['srv.expose', 'srv.networkSecurity'])
await copy(join(__dirname, 'files', `mtxs-configmap-${isJava ? 'java' : 'nodejs'}.yaml`)).to('chart', 'templates', 'mtxs-configmap.yaml') // REVISIT: Move to build task
await copy(join(__dirname, 'files', 'saas-registry-secret.yaml')).to('chart', 'templates', 'saas-registry-secret.yaml') // REVISIT: Move to build task

@@ -109,0 +110,0 @@ }

@@ -8,3 +8,3 @@ const fs = require('fs').promises;

require('../util/axios').pruneErrors().setReqTimeout();
require('../util/axios').pruneErrors();
const { collectFileContent, collectFiles, toUnixPath } = require('./util/fs');

@@ -11,0 +11,0 @@ const { getMessage } = require('./util/logging');

@@ -7,3 +7,3 @@ const axios = require('axios');

require('../util/axios').pruneErrors().setReqTimeout();
require('../util/axios').pruneErrors();
const { getMessage } = require('./util/logging');

@@ -131,4 +131,6 @@

const params = new ParamCollection(paramValues);
await SettingsManager.setKeytar(params, true);
if (params.get('clearInvalid')) {
await SettingsManager.deleteInvalid();
await SettingsManager.deleteInvalidSettings();
await SettingsManager.deleteInvalidTokens();
} else {

@@ -135,0 +137,0 @@ await SettingsManager.deleteToken(params);

@@ -7,3 +7,3 @@ const fs = require('fs');

require('../util/axios').pruneErrors().setReqTimeout();
require('../util/axios').pruneErrors();
const { handleHttpError } = require('./util/errors');

@@ -10,0 +10,0 @@

@@ -7,3 +7,3 @@ const fs = require('fs');

require('../util/axios').pruneErrors().setReqTimeout();
require('../util/axios').pruneErrors();
const { normalizePath, isParent } = require('./util/fs');

@@ -167,3 +167,3 @@ const Question = require('./util/question');

static promptTemplateOverwrite() {
return Question.askBooleanQuestion(`This will overwrite extisting templates in folder ${TEMPLATE_FOLDER}. Continue (yN)? `);
return Question.askBooleanQuestion(`This will overwrite existing templates in folder ${TEMPLATE_FOLDER}. Continue (yN)? `);
}

@@ -170,0 +170,0 @@

@@ -10,3 +10,3 @@ const cds = require ('../cds')

require('../util/axios').pruneErrors().setReqTimeout();
require('../util/axios').pruneErrors();
const { getMessage } = require('./util/logging');

@@ -13,0 +13,0 @@ const { handleHttpError } = require('./util/errors');

@@ -9,3 +9,3 @@ const cds = require ('../cds'), { local } = cds.utils;

require('../util/axios').pruneErrors().setReqTimeout();
require('../util/axios').pruneErrors();
const { getMessage } = require('./util/logging');

@@ -12,0 +12,0 @@ const { schemaRegex } = require('./util/urls');

@@ -53,2 +53,3 @@ const fs = require('fs');

let keytar;
let keytarDisabled;

@@ -130,2 +131,10 @@ function other(tokenStorage) {

function getAccountUrl(account) {
return account.includes('|') ? account.split('|')[0] : null;
}
function getAccountSubdomain(account) {
return account.includes('|') ? account.split('|')[1] : null;
}
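The keyring account encoding handled by the two helpers above can be sketched standalone; the sample URL and subdomain are illustrative:

```javascript
// Standalone sketch of the account encoding above: keyring account strings
// are assumed to take the form "<appUrl>|<subdomain>"; anything without the
// separator yields null.
function getAccountUrl(account) {
  return account.includes('|') ? account.split('|')[0] : null
}
function getAccountSubdomain(account) {
  return account.includes('|') ? account.split('|')[1] : null
}

const account = 'https://my-app.example.com|customer1' // illustrative values
console.log(getAccountUrl(account))       // 'https://my-app.example.com'
console.log(getAccountSubdomain(account)) // 'customer1'
```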
function logConfigPaths() {

@@ -150,2 +159,4 @@ DEBUG?.(`Settings are stored in ${CONFIG.paths.settings}`);

logConfigPaths();
keytar = undefined;
keytarDisabled = undefined;
}

@@ -181,3 +192,3 @@

params.delete('clearOtherTokenStorage');
await this.deleteToken(params, true);
await this.deleteToken(params, { fromOtherStorage: true });
}

@@ -266,3 +277,3 @@

static async setKeytar(params, logout = false) {
if (params.get('skipToken')) {
if (keytar || keytarDisabled || params.get('skipToken')) {
return;

@@ -281,3 +292,3 @@ }

}
// keytar missing
keytarDisabled = true;

@@ -327,3 +338,15 @@ if (params.get('tokenStorage') === 'keyring' && params.get('saveData')) {

}
function setRelevantAuth(auth) {
async function addAuthSettings() {
function removeObsoleteToken() {
if (params.get('renewLogin') || params.get('tokenExpirationDate') <= Date.now()) {
DEBUG?.((params.get('renewLogin') ? 'Renewing' : 'Refreshing expired') + ' authentication token');
params.delete('token');
params.delete('tokenExpirationDate');
}
}
const auth = {
plain: await SettingsManager._loadAuthFromFile(params.get('appUrl'), params.get('subdomain')),
keyring: await SettingsManager._loadAuthFromKeyring(params.get('appUrl'), params.get('subdomain'))
};
if (!(auth.keyring.has('tokenStorage') || auth.plain.has('tokenStorage'))) {

@@ -358,8 +381,4 @@ // Saved auth data not present.

}
}
function removeObsoleteToken() {
if (params.get('renewLogin') || params.get('tokenExpirationDate') <= Date.now()) {
DEBUG?.((params.get('renewLogin') ? 'Renewing' : 'Refreshing expired') + ' authentication token');
params.delete('token');
params.delete('tokenExpirationDate');
if (params.has('token')) {
removeObsoleteToken();
}

@@ -398,11 +417,3 @@ }

const auth = {
plain: await this._loadAuthFromFile(params.get('appUrl'), params.get('subdomain')),
keyring: await this._loadAuthFromKeyring(params.get('appUrl'), params.get('subdomain'))
};
setRelevantAuth(auth);
if (params.has('token')) {
removeObsoleteToken();
}
await addAuthSettings();
if (!logout && !params.has('token') && !params.has('refreshToken') && !params.has('passcode') && !params.has('clientsecret') && !params.has('key')) {

@@ -413,3 +424,3 @@ await addPasscode();

static async deleteToken(params, fromOtherStorage = false) {
static async deleteToken(params, { fromOtherStorage = false, invalid = false } = {}) {
const allParams = params.clone();

@@ -422,7 +433,9 @@ await this.loadAndMergeSettings(allParams, true);

DEBUG?.(`Deleting authentication data${fromOtherStorage ? ' from other storage' : ''} for ${target}`);
DEBUG?.(`Deleting${invalid ? ' invalid' : ''} authentication data${fromOtherStorage ? ' from other storage' : ''} for ${target}`);
fromStorage = fromStorage || allParams.get('tokenStorage');
if (!fromStorage) {
console.log('Failed to delete authentication data: none found for', target);
if (!invalid) {
console.log('Failed to delete authentication data: none found for', target);
}
return;

@@ -444,3 +457,3 @@ }

static async deleteInvalid() {
static async deleteInvalidSettings() {
const settingsByFolder = await this._loadFromFile(undefined);

@@ -466,3 +479,3 @@ const deletionFolders = [];

if (!urlReference) {
await this.deleteToken(new ParamCollection({ appUrl, subdomain }));
await this.deleteToken(new ParamCollection({ appUrl, subdomain }), { invalid: true });
}

@@ -472,6 +485,24 @@ }

} else {
console.log('All settings seem valid');
console.log('All saved project folders seem valid');
}
}
static async deleteInvalidTokens() {
async function deleteFrom(allAuth) {
for (const [appUrl, authForUrl = {}] of Object.entries(allAuth)) {
for (const [subdomain, auth] of Object.entries(authForUrl)) {
if (!auth.token) {
continue;
}
if (auth.tokenExpirationDate <= Date.now()) {
await SettingsManager.deleteToken(new ParamCollection({ appUrl, subdomain }), { invalid: true });
}
}
}
}
await deleteFrom(await this._loadAuthFromFile(undefined, undefined));
await deleteFrom(await this._loadAllAuthFromKeyring());
}
static async _saveToFile(projectFolder, paramValues) {

@@ -576,3 +607,3 @@ const paramValuesByFolder = await this._loadFromFile(undefined);

} catch (err) {
DEBUG?.('Empty authentication-data file');
DEBUG?.('File contains invalid saved auth');
return emptyData(all);

@@ -631,2 +662,22 @@ }

static async _loadAllAuthFromKeyring() {
if (!keytar) {
return {};
}
return (await keytar.findCredentials(MTX_FULLY_QUALIFIED))
.reduce((result, { account, password: authString }) => {
const appUrl = getAccountUrl(account);
const subdomain = getAccountSubdomain(account);
if (appUrl && subdomain) {
try {
(result[appUrl] ??= {})[subdomain] = JSON.parse(authString);
} catch (error) {
throw new Error(`${capitalize(TOKEN_STORAGE_DESC.keyring)} contains invalid saved auth for URL ${appUrl} and subdomain ${subdomain}`);
}
}
return result;
}, {});
}
}

@@ -633,0 +684,0 @@

@@ -5,3 +5,3 @@ const cds = require('../cds');

require('../util/axios').pruneErrors().setReqTimeout();
require('../util/axios').pruneErrors();
const { getMessage } = require('./util/logging');

@@ -8,0 +8,0 @@ const { capitalize } = require('./util/strings');

@@ -5,3 +5,2 @@ const path = require('path');

const { concatUrls } = require('./urls');
require('../../util/axios').pruneErrors().setReqTimeout();

@@ -8,0 +7,0 @@ const MTX_URL_PATH_PREFIX = 'mtx/v1/model/';

module.exports = {
pruneErrors,
setReqTimeout
pruneErrors
};
// Will only count the time until the server starts responding, not the time until the request is finished.
const GENERIC_REQUEST_TIMEOUT_MS = 1500;
function setReqTimeout() {
require('axios').interceptors.request.use(
config => {
config.timeout = GENERIC_REQUEST_TIMEOUT_MS;
return config;
},
error => {
return Promise.reject(error)
}
);
return module.exports;
}
function pruneErrors() {

@@ -24,0 +6,0 @@ function stacklessError(message) {

@@ -119,3 +119,3 @@ // eslint-disable-next-line no-console

apiEndpoint: this._extract(config.Target, /\s*(.+)\s*/, `CF API endpoint is missing. Use 'cf login' to login.`),
org: this._extract(config.OrganizationFields.Name, /\s*(.+)\s*/, `CF org is missing. Use 'cf target -o <ORG> to specify.`),
org: this._extract(config.OrganizationFields.Name, /\s*(.+)\s*/, `CF org is missing. Use 'cf target -o <ORG>' to specify.`),
space: this._extract(config.SpaceFields.Name, /\s*(.+)\s*/, `CF space is missing. Use 'cf target -s <SPACE>' to specify.`)

@@ -135,3 +135,3 @@ }

user: this._extract(result.stdout, /user\s*:\s*(.+)/i, `CF user is missing. Use 'cf login' to login.`),
org: this._extract(result.stdout, /org\s*:\s*(.+)/i, `CF org is missing. Use 'cf target -o <ORG> to specify.`),
org: this._extract(result.stdout, /org\s*:\s*(.+)/i, `CF org is missing. Use 'cf target -o <ORG>' to specify.`),
space: this._extract(result.stdout, /space\s*:\s*(.+)/i, `CF space is missing. Use 'cf target -s <SPACE>' to specify.`),

@@ -194,2 +194,10 @@ };

async getServices(app) {
const { spaceGuid: space_guids, orgGuid: organization_guids } = await this.getCfSpaceInfo();
const apps = await this._cfRequest('/v3/apps', { space_guids, organization_guids, names: app });
const guid = apps.resources[0].guid;
const env = await this._cfRequest('/v3/apps/'+guid+'/env', { space_guids, organization_guids, names: app });
return Object.values(env.system_env_json.VCAP_SERVICES).flat();
}
async getService(serviceName, showMessage = true) {

@@ -329,11 +337,10 @@ showMessage && console.log(`Getting service ${bold(serviceName)}`);

async getOrCreateServiceKey(serviceInstance, serviceKeyName) {
async getOrCreateServiceKey(serviceInstance, serviceKeyName, parameters, { silent = false, create = true } = {}) {
const serviceKey = await this.getServiceKey(serviceInstance, serviceKeyName, false);
if (serviceKey) {
console.log(`Getting service key ${bold(serviceKeyName)}`);
if (serviceKey || !create) {
if (serviceKey && !silent) console.log(`Getting service key ${bold(serviceKeyName)}`);
return serviceKey;
}
console.log(`Creating service key ${bold(serviceKeyName)} - please be patient...`);
if (!silent) console.log(`Creating service key ${bold(serviceKeyName)} - please be patient...`);

@@ -350,5 +357,3 @@ const body = {

},
parameters: {
permissions: 'development'
}
parameters
}

@@ -355,0 +360,0 @@

{
"name": "@sap/cds-dk",
"version": "7.5.1",
"version": "7.6.0",
"description": "Command line client and development toolkit for the SAP Cloud Application Programming Model",

@@ -5,0 +5,0 @@ "homepage": "https://cap.cloud.sap/",

Sorry, the diff of this file is not supported yet


Sorry, the diff of this file is too big to display
