@stencila/dockter - npm Package Compare versions

Comparing version 0.3.0 to 0.3.1


CONTRIBUTING.md

@@ -0,1 +1,19 @@

# Contributing
🎉 Thanks for taking the time to contribute to Dockter! 🎉
# Table of Contents
- [General contribution guidelines](#general-contribution-guidelines)
* [Licensing and contributor agreement](#licensing-and-contributor-agreement)
- [Development](#development)
* [Development environment](#development-environment)
* [Linting and testing](#linting-and-testing)
* [Documentation generation](#documentation-generation)
* [Commit messages](#commit-messages)
* [Continuous integration](#continuous-integration)
* [Using the router and server](#using-the-router-and-server)
# General contribution guidelines

@@ -6,2 +24,5 @@

These are mostly guidelines, not rules.
Use your best judgment, and feel free to propose changes to this document in a pull request.
If you are comfortable with Git and GitHub, you can submit a pull request (PR). In Stencila we follow a commonly used workflow

@@ -15,3 +36,3 @@ for [contributing to open source projects][how-contribute] (see also [GitHub instructions][github-flow]).

## Licensing and contributor agreement
## Licensing and code of conduct

@@ -21,4 +42,15 @@ By contributing, you agree that we may redistribute your work under [our license](LICENSE).

# Development
## Get in touch!
You can chat with the team at our [community forum][community-forum],
on Twitter [@Stencila][stencila-twitter],
[Gitter][stencila-gitter], or email to [hello@stenci.la][contact]
## Development
### Getting started
### Development environment
Dockter is implemented as a `Node.js` package in order to make it easier to integrate with other Stencila components, which are also written in this language.

@@ -44,5 +76,19 @@ Therefore, in order to develop Dockter you need to have `Node.js` installed on your machine, along with `npm`.

## Architecture
Dockter implements a compiler design pattern. Source files are _parsed_ into a `SoftwareEnvironment` instance (the equivalent of an AST (Abstract Syntax Tree) in other programming language compilers) which is then used to generate a `Dockerfile` which is then built into a Docker image.
The parser classes, e.g. `PythonParser` and `RParser`, scan for relevant source files and generate `SoftwareEnvironment` instances.
The generator classes, e.g. `PythonGenerator` and `RGenerator`, generate a `Dockerfile` for a given `SoftwareEnvironment`.
`DockerGenerator` is a super-generator which combines the other generators.
The `DockerBuilder` class builds a Docker image from the generated `Dockerfile`.
`DockerCompiler` links all of these together.
For example, if a folder has a single file in it, `code.py`, `PythonParser` will parse that file and create a `SoftwareEnvironment` instance, which `DockerGenerator` uses to generate a `Dockerfile`, which `DockerBuilder` uses to build a Docker image.
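The compiler pattern described above can be sketched in plain JavaScript. This is an illustrative sketch only; the class names mirror the ones above but the methods and properties are simplified stand-ins, not Dockter's actual API:

```js
// Illustrative sketch of Dockter's compiler pattern (simplified, not the real API).
// A parser scans source files into a `SoftwareEnvironment`-like object,
// and a generator turns that object into Dockerfile text.

class PythonParser {
  // Stand-in for real parsing: pretend any .py file implies a pandas requirement
  parse (files) {
    const requirements = files.some(file => file.endsWith('.py')) ? ['pandas'] : []
    return { runtimePlatform: 'Python', softwareRequirements: requirements }
  }
}

class PythonGenerator {
  // Turn the environment into Dockerfile instructions
  generate (environ) {
    return [
      'FROM python:3.7',
      `RUN pip install ${environ.softwareRequirements.join(' ')}`
    ].join('\n')
  }
}

// The equivalent of `DockerCompiler` linking a parser and generator together
const environ = new PythonParser().parse(['code.py'])
const dockerfile = new PythonGenerator().generate(environ)
console.log(dockerfile)
```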
### Linting and testing
Then take a look at the docs ([online](https://stencila.github.io/dockter/) or inline) and start hacking! Please check that your changes pass linting and unit tests,
Please check that your changes pass linting and unit tests,

@@ -54,7 +100,7 @@ ```bash

Use `npm test -- <test file path>` to run a single test file
Use `npm test -- <test file path>` to run a single test file.
You can setup a Git pre-commit hook to perform these checks automatically before each commit using `make hooks`.
Check that any changes you've made are covered 🏅 by unit tests using,
You can check that any changes you've made are covered 🏅 by unit tests using,

@@ -77,3 +123,4 @@ ```bash

Please use [conventional changelog](https://github.com/conventional-changelog/conventional-changelog) style commit messages e.g. `docs(readme): fixed spelling mistake`. This help with automated semantic versioning. To make this easier, [Commitzen](http://commitizen.github.io/cz-cli/) is a development dependency and can be used via `npm` or `make`:
Please use [conventional changelog](https://github.com/conventional-changelog/conventional-changelog) style commit messages e.g. `docs(readme): fixed spelling mistake`. See [the specifications](https://www.conventionalcommits.org/en/v1.0.0-beta.2/) for full details. This helps with automated semantic versioning.
To make this easier, [Commitizen](http://commitizen.github.io/cz-cli/) is a development dependency and can be used via `npm` or `make`:

@@ -86,29 +133,30 @@ ```bash

Linting, test coverage, binary builds, package builds, and documentation generation are done on each push on [Travis CI](https://travis-ci.org/stencila/dockter). [`semantic-release`](https://github.com/semantic-release/semantic-release) is enabled to automate version management, Github releases and NPM package publishing.
Linting, test coverage, binary builds, package builds, and documentation generation are done on [Travis CI](https://travis-ci.org/stencila/dockter). [`semantic-release`](https://github.com/semantic-release/semantic-release) is enabled to automate version management: a minor version release is made if any `feat(...)` commits are pushed, and a patch release if any `fix(...)` commits are pushed. Releases are made to [NPM](https://www.npmjs.com/package/@stencila/dockter) and [GitHub Releases](https://github.com/stencila/dockter/releases).
### Using the router and server
## See also
The [Express](https://expressjs.com) router provides `PUT /compile` and `PUT /execute` endpoints (which do the same thing as the corresponding CLI commands). You can serve them using,
Related Stencila packages include:
```bash
npm start
```
- 🦄 [`stencila/tunix`](https://github.com/stencila/tunix): compiles JSON-LD `SoftwareEnvironment` nodes to [NixOS](https://nixos.org/) environments
- 🦄 [`stencila/kubex`](https://github.com/stencila/kubex): executes JSON-LD `SoftwareEnvironment` nodes on [Kubernetes](https://kubernetes.io/) clusters
Or, during development using,
There are several projects that create Docker images from source code and/or requirements files:
```bash
npm run server
```
- [`alibaba/derrick`](https://github.com/alibaba/derrick)
- [`jupyter/repo2docker`](https://github.com/jupyter/repo2docker)
- [`Gueils/whales`](https://github.com/Gueils/whales)
- [`o2r-project/containerit`](https://github.com/o2r-project/containerit)
- [`openshift/source-to-image`](https://github.com/openshift/source-to-image)
- [`ViDA-NYU/reprozip`](https://github.com/ViDA-NYU/reprozip)
A minimal example of how to integrate the router into your own Express server,
Dockter is similar to `repo2docker`, `containerit`, and `reprozip` in that it is aimed at researchers doing data analysis (and supports R), whereas most other tools are aimed at software developers (and don't support R). Dockter differs from these projects principally in that, by default (but optionally), it installs the necessary Stencila language packages so that the image can talk to Stencila client interfaces and provide code execution services. Like `repo2docker` it allows for multi-language images, but has the additional features of package dependency analysis of source code, managed builds, and generation of image meta-data.
```js
const app = require('express')()
const { docker } = require('@stencila/dockter')
const app = express()
app.use('/docker', docker)
app.listen(3000)
```
## Get in touch!
You can chat with the team at our [community forum][community-forum],
on Twitter [@Stencila][stencila-twitter],
[Gitter][stencila-gitter], or email to [hello@stenci.la][contact]

@@ -115,0 +163,0 @@ [contact]: mailto:hello@stenci.la

@@ -38,3 +38,3 @@ #!/usr/bin/env node

// @ts-ignore
.command('compile [folder] [format]', 'Compile a project to a software environment', yargs => {
.command('compile [folder]', 'Compile a project to a software environment', yargs => {
yargs.positional('folder', {

@@ -44,11 +44,5 @@ type: 'string',

describe: 'The path to the project folder'
}),
yargs.positional('format', {
type: 'string',
default: 'json',
describe: 'Format to output the environment: json or yaml'
});
});
}, async (args) => {
const node = await compiler.compile('file://' + args.folder, false).catch(error);
output(node, args.format);
await compiler.compile('file://' + args.folder, false).catch(error);
})

@@ -68,3 +62,3 @@ // Build command

// @ts-ignore
.command('execute [folder] [format]', 'Execute a project', yargs => {
.command('execute [folder]', 'Execute a project', yargs => {
yargs.positional('folder', {

@@ -71,0 +65,0 @@ type: 'string',

@@ -1,2 +0,2 @@

import { SoftwareEnvironment } from './context';
import { SoftwareEnvironment } from '@stencila/schema';
export default class DockerCompiler {

@@ -3,0 +3,0 @@ /**

@@ -8,3 +8,3 @@ "use strict";

const path_1 = __importDefault(require("path"));
const context_1 = require("./context");
const schema_1 = require("@stencila/schema");
const DockerParser_1 = __importDefault(require("./DockerParser"));

@@ -39,3 +39,3 @@ const RParser_1 = __importDefault(require("./RParser"));

// Read existing environment from file
environ = new context_1.SoftwareEnvironment();
environ = new schema_1.SoftwareEnvironment();
const object = JSON.parse(fs_1.default.readFileSync(path_1.default.join(folder, 'environ.jsonld'), 'utf8'));

@@ -58,3 +58,3 @@ environ = Object.assign(environ, object);

if (!environ)
environ = new context_1.SoftwareEnvironment();
environ = new schema_1.SoftwareEnvironment();
// Save environ as an intermediate file

@@ -61,0 +61,0 @@ fs_1.default.writeFileSync(path_1.default.join(folder, '.environ.jsonld'), JSON.stringify(environ, null, ' '));

import Generator from './Generator';
import { SoftwareEnvironment } from './context';
import { SoftwareEnvironment } from '@stencila/schema';
/**

@@ -4,0 +4,0 @@ * A Dockerfile generator that collects instructions from

@@ -0,3 +1,3 @@

import { SoftwareEnvironment } from '@stencila/schema';
import Parser from './Parser';
import { SoftwareEnvironment } from './context';
/**

@@ -4,0 +4,0 @@ * Parser for Dockerfiles

@@ -6,5 +6,5 @@ "use strict";

Object.defineProperty(exports, "__esModule", { value: true });
const schema_1 = require("@stencila/schema");
const docker_file_parser_1 = __importDefault(require("docker-file-parser"));
const Parser_1 = __importDefault(require("./Parser"));
const context_1 = require("./context");
/**

@@ -66,3 +66,3 @@ * Parser for Dockerfiles

}
const environ = new context_1.SoftwareEnvironment();
const environ = new schema_1.SoftwareEnvironment();
// Parse instructions from the Dockerfile

@@ -93,3 +93,3 @@ const instructions = docker_file_parser_1.default.parse(dockerfile);

case 'author':
environ.authorsPush(context_1.Person.fromText(value));
environ.authors.push(schema_1.Person.fromText(value));
break;

@@ -106,3 +106,3 @@ }

throw new Error(`Unexpected type of instruction arguments ${typeof instruction.args}`);
environ.authorsPush(context_1.Person.fromText(author));
environ.authors.push(schema_1.Person.fromText(author));
}

@@ -109,0 +109,0 @@ return environ;

import Doer from './Doer';
import { SoftwareEnvironment, SoftwarePackage } from './context';
import { SoftwareEnvironment, SoftwarePackage } from '@stencila/schema';
/**

@@ -4,0 +4,0 @@ * Generates a Dockerfile for a `SoftwareEnvironment` instance

import Doer from './Doer';
import { SoftwareEnvironment } from './context';
import { SoftwareEnvironment } from '@stencila/schema';
/**

@@ -4,0 +4,0 @@ * A base class for language parsers

import Generator from './Generator';
import { SoftwareEnvironment } from './context';
import { SoftwareEnvironment } from '@stencila/schema';
/**

@@ -4,0 +4,0 @@ * A Dockerfile generator for Python environments

@@ -38,3 +38,3 @@ "use strict";

}
return this.filterPackages('Python').map(requirement => requirement.identifier()).join('\n');
return this.filterPackages('Python').map(requirement => `${requirement.name}${requirement.version}`).join('\n');
}

@@ -41,0 +41,0 @@ installFiles(sysVersion) {

import Parser from './Parser';
import { SoftwareEnvironment } from './context';
import { SoftwareEnvironment } from '@stencila/schema';
export declare enum RequirementType {

@@ -4,0 +4,0 @@ Named = 0,

@@ -7,3 +7,3 @@ "use strict";

const Parser_1 = __importDefault(require("./Parser"));
const context_1 = require("./context");
const schema_1 = require("@stencila/schema");
const path_1 = require("path");

@@ -62,3 +62,3 @@ const REQUIREMENTS_COMMENT_REGEX = /^\s*#/;

}
const environ = new context_1.SoftwareEnvironment();
const environ = new schema_1.SoftwareEnvironment();
if (this.folder) {

@@ -70,3 +70,3 @@ environ.name = path_1.basename(this.folder);

if (rawRequirement.type === RequirementType.Named) {
let standardRequirement = new context_1.SoftwarePackage();
let standardRequirement = new schema_1.SoftwarePackage();
standardRequirement.name = rawRequirement.value;

@@ -77,6 +77,6 @@ standardRequirement.runtimePlatform = 'Python';

}
environ.softwareRequirementsPush(standardRequirement);
environ.softwareRequirements.push(standardRequirement);
}
else if (rawRequirement.type === RequirementType.URL) {
let sourceRequirement = new context_1.SoftwareSourceCode();
let sourceRequirement = new schema_1.SoftwareSourceCode();
sourceRequirement.runtimePlatform = 'Python';

@@ -83,0 +83,0 @@ sourceRequirement.codeRepository = rawRequirement.value;

import Parser from './Parser';
import { SoftwareEnvironment } from './context';
import { SoftwareEnvironment } from '@stencila/schema';
/**

@@ -4,0 +4,0 @@ * Dockter `Parser` class for R requirements files and source code.

@@ -8,3 +8,3 @@ "use strict";

const Parser_1 = __importDefault(require("./Parser"));
const context_1 = require("./context");
const schema_1 = require("@stencila/schema");
/**

@@ -25,3 +25,3 @@ * Dockter `Parser` class for R requirements files and source code.

async parse() {
const environ = new context_1.SoftwareEnvironment();
const environ = new schema_1.SoftwareEnvironment();
let name;

@@ -129,3 +129,3 @@ let version;

// Thing > CreativeWork > SoftwareSourceCode > SoftwarePackage
const pkg = new context_1.SoftwarePackage();
const pkg = new schema_1.SoftwarePackage();
pkg.name = name;

@@ -152,13 +152,13 @@ // These packages are built-in to R distributions, so we don't need to collect

const name = match[1];
const person = context_1.Person.fromText(name);
const person = schema_1.Person.fromText(name);
const roles = match[2].split(', ');
if (roles.includes('aut'))
context_1.push(pkg, 'authors', person);
pkg.authors.push(person);
if (roles.includes('ctb'))
context_1.push(pkg, 'contributors', person);
pkg.contributors.push(person);
if (roles.includes('cre'))
context_1.push(pkg, 'creators', person);
pkg.creators.push(person);
}
else {
context_1.push(pkg, 'authors', context_1.Person.fromText(author));
pkg.authors.push(schema_1.Person.fromText(author));
}

@@ -190,6 +190,6 @@ });

// Handle strings e.g. curl https://sysreqs.r-hub.io/pkg/XML
const required = new context_1.SoftwarePackage();
const required = new schema_1.SoftwarePackage();
required.name = debPackage;
required.runtimePlatform = 'deb';
pkg.softwareRequirementsPush(required);
pkg.softwareRequirements.push(required);
}

@@ -200,12 +200,12 @@ else if (Array.isArray(debPackage)) {

if (deb.buildtime) {
const required = new context_1.SoftwarePackage();
const required = new schema_1.SoftwarePackage();
required.name = deb.buildtime;
required.runtimePlatform = 'deb';
pkg.softwareRequirementsPush(required);
pkg.softwareRequirements.push(required);
}
if (deb.runtime) {
const required = new context_1.SoftwarePackage();
const required = new schema_1.SoftwarePackage();
required.name = deb.runtime;
required.runtimePlatform = 'deb';
pkg.softwareRequirementsPush(required);
pkg.softwareRequirements.push(required);
}

@@ -212,0 +212,0 @@ }

@@ -15,6 +15,8 @@ ## Using Dockter

Dockter is a tool to make it easier for researchers to create reproducible research environments. It generates a Docker image for a research project based on the source code in it. That means that you don’t need to learn a new file format (`Dockerfiles`) to create Docker images.
Dockter is a tool to make it easier for researchers to create reproducible research environments. It generates a Docker image for a research project based on the source code in it. That means that you don’t need to learn a new file format (`Dockerfiles`) to create Docker images. Dockter also makes it easy to track dependencies and update the image when they change.
In addition, Dockter manages the image building process to better fit your everyday workflow (determines package system dependencies, manages builds) and generates JSON-LD meta-data describing the image (from which citations etc can be generated).
You can manually edit the files which Dockter generates so that the containers built from them exactly suit your project.
#### 1. Install

@@ -132,3 +134,3 @@

#### Resources
### Resources

@@ -135,0 +137,0 @@ [A beginner's guide to containters](https://medium.freecodecamp.org/a-beginner-friendly-introduction-to-containers-vms-and-docker-79a9e3e119b)

{
"name": "@stencila/dockter",
"version": "0.3.0",
"version": "0.3.1",
"description": "A Docker image builder for researchers",

@@ -54,5 +54,3 @@ "main": "dist/index.js",

"@types/node-persist": "0.0.33",
"@types/request": "^2.47.1",
"@types/tar-fs": "^1.16.1",
"@types/tar-stream": "^1.6.0",
"@types/tmp": "0.0.33",

@@ -75,2 +73,3 @@ "@types/yargs": "^12.0.1",

"dependencies": {
"@stencila/schema": "0.1.2",
"docker-file-parser": "^1.0.4",

@@ -83,6 +82,4 @@ "dockerode": "^2.5.7",

"node-persist": "^3.0.1",
"request": "^2.88.0",
"rimraf": "^2.6.2",
"tar-fs": "^1.16.3",
"tar-stream": "^1.6.2",
"tmp": "0.0.33",

@@ -89,0 +86,0 @@ "yargonaut": "^1.1.4",

<div align="center">
<img src="http://stenci.la/img/logo-name.png" alt="Stencila" style="max-width:100px">
<img src="http://stenci.la/img/logo-name.png" alt="Stencila" style="max-height:60px">
</div>

@@ -8,2 +8,4 @@ <br>

> :sparkles: Help us [choose a better name](https://github.com/stencila/dockter/issues/37) for this project! :sparkles:
[![Build status](https://travis-ci.org/stencila/dockter.svg?branch=master)](https://travis-ci.org/stencila/dockter)

@@ -13,9 +15,11 @@ [![Code coverage](https://codecov.io/gh/stencila/dockter/branch/master/graph/badge.svg)](https://codecov.io/gh/stencila/dockter)

[![NPM](http://img.shields.io/npm/v/@stencila/dockter.svg?style=flat)](https://www.npmjs.com/package/@stencila/dockter)
[![Docs](https://img.shields.io/badge/docs-latest-blue.svg)](https://stencila.github.io/dockter/)
[![Chat](https://badges.gitter.im/stencila/stencila.svg)](https://gitter.im/stencila/stencila)
[![Docs](https://img.shields.io/badge/docs-API-blue.svg)](https://stencila.github.io/dockter/)
Docker is a good tool for creating reproducible computing environments. But creating truly reproducible Docker images can be difficult. Dockter aims to make it easier for researchers to create Docker images for their research projects. Dockter automatically creates and manages a Docker image for _your_ project based on _your_ source source code.
Docker is a useful tool for creating reproducible computing environments. But creating truly reproducible Docker images can be difficult - even if you already know how to write a `Dockerfile`.
> 🦄 Features that are not yet implemented are indicated by unicorn emoji. Usually they have a link next to them, like this 🦄 [#2](https://github.com/stencila/dockter/issues/2), indicating the relevent issue where you can help make the feature a reality. It's [readme driven development](http://tom.preston-werner.com/2010/08/23/readme-driven-development.html) with calls to action to chase after mythical vaporware creatures! So hip.
Dockter makes it easier for researchers to create Docker images for their research projects. Dockter generates a `Dockerfile` and builds an image, for _your_ project, based on _your_ source code.
> 🦄 Dockter is in early development. Features that are not yet implemented are indicated by unicorn emoji. Usually they have a link next to them, like this 🦄 [#2](https://github.com/stencila/dockter/issues/2), indicating the relevant issue where you can help make the feature a reality. It's [readme driven development](http://tom.preston-werner.com/2010/08/23/readme-driven-development.html) with calls to action to chase after mythical vaporware creatures! So hip.
<!-- Automatically generated TOC. Don't edit, `make docs` instead -->

@@ -26,3 +30,3 @@

- [Features](#features)
* [Automatically builds a Docker image for your project](#automatically-builds-a-docker-image-for-your-project)
* [Builds a Docker image for your project sources](#builds-a-docker-image-for-your-project-sources)
+ [R](#r)

@@ -32,3 +36,4 @@ + [Python](#python)

+ [Jupyter](#jupyter)
* [Efficiently handling of updates to project code](#efficiently-handling-of-updates-to-project-code)
* [Quicker re-installation of language packages](#quicker-re-installation-of-language-packages)
+ [An example](#an-example)
* [Generates structured meta-data for your project](#generates-structured-meta-data-for-your-project)

@@ -40,13 +45,6 @@ * [Easy to pick up, easy to throw away](#easy-to-pick-up-easy-to-throw-away)

- [Use](#use)
* [CLI](#cli-1)
+ [Compile a project](#compile-a-project)
+ [Build a Docker image](#build-a-docker-image)
+ [Execute a Docker image](#execute-a-docker-image)
* [Router and server](#router-and-server)
- [Architecture](#architecture)
* [Compile a project](#compile-a-project)
* [Build a Docker image](#build-a-docker-image)
* [Execute a Docker image](#execute-a-docker-image)
- [Contribute](#contribute)
* [Linting and testing](#linting-and-testing)
* [Documentation generation](#documentation-generation)
* [Commit messages](#commit-messages)
* [Continuous integration](#continuous-integration)
- [See also](#see-also)

@@ -60,9 +58,11 @@ - [FAQ](#faq)

### Automatically builds a Docker image for your project
### Builds a Docker image for your project sources
Dockter scans your project folder and builds a Docker image for it. If the the folder already has a Dockerfile, then Dockter will build the image from that. If not, Dockter will scan the files in the folder, generate a `.Dockerfile` and build the image from that. How Dockter builds the image from your source code depends on the language.
Dockter scans your project folder and builds a Docker image for it. If the folder already has a `Dockerfile`, Dockter will build the image from that. If not, Dockter will scan the source code files in the folder, generate a `.Dockerfile` and use that.
Dockter currently handles R and Python source code (with Node.js support planned). A project can have a mix of these languages.
#### R
If the folder contains a R [`DESCRIPTION`](http://r-pkgs.had.co.nz/description.html) file then Docker will build an image with the R packages listed under `Imports` installed. e.g.
If the folder contains an R package [`DESCRIPTION`](http://r-pkgs.had.co.nz/description.html) file then Dockter will build an image with the R packages listed under `Imports` installed. e.g.
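For instance, a minimal `DESCRIPTION` file along these lines (a hypothetical sketch; the package name and imports are illustrative) would result in an image with `ggplot2` installed:

```
Package: myproject
Version: 1.0.0
Imports:
  ggplot2
```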

@@ -81,7 +81,7 @@ ```

If the folder contains a `main.R` file, Dockter will set that to be the default script to run in any container created from the image.
Dockter checks if any of your dependencies (or dependencies of dependencies, or dependencies of...) require system packages (e.g. `libxml-dev`) and installs those too. No more trial and error of build, fail, add dependency, repeat... cycles!
#### Python
If the folder contains a 🦄 [#3](https://github.com/stencila/dockter/issues/3) [`requirements.txt`](https://pip.readthedocs.io/en/1.1/requirements.html) file, or a 🦄 [#4](https://github.com/stencila/dockter/issues/4) [`Pipfile`](https://github.com/pypa/pipfile), Dockter will copy it into the Docker image and use `pip` to install the specified packages.
If the folder contains a [`requirements.txt`](https://pip.readthedocs.io/en/1.1/requirements.html) file, or a 🦄 [#4](https://github.com/stencila/dockter/issues/4) [`Pipfile`](https://github.com/pypa/pipfile), Dockter will copy it into the Docker image and use `pip` to install the specified packages.

@@ -101,8 +101,16 @@ If the folder does not contain either of those files then Dockter will 🦄 [#5](https://github.com/stencila/dockter/issues/5) scan all the folder's `.py` files for `import` statements and create a `.requirements.txt` file for you.

### Efficiently handling of updates to project code
### Quicker re-installation of language packages
Docker layered filesystem has advantages but it can cause real delays when you are updating your project dependencies. For example, see [this issue](https://github.com/npm/npm/issues/11446) for the workarounds used by Node.js developers to prevent long waits when they update their `package.json`. The reason this happens is that when you update a requirements file Docker throws away all the subsequent layers, including the one where you install all your package dependencies.
If you have built a Docker image you'll know that it can be frustrating waiting for *all* your project's dependencies to reinstall when you simply add or remove one of them.
Here's a simple motivating [example](fixtures/tests/py-pandas) of a Dockerized Python project. It's got a [`pip`](https://pypi.org/project/pip/) `requirements.txt` file which specifies that the project requires `pandas` which, to ensure reproducibility, is pinned to version `0.23.0`,
The reason this happens is that, due to Docker's layered filesystems, when you update a requirements file, Docker throws away all the subsequent layers - including the one where you previously installed your dependencies. That means that all those packages need to get reinstalled.
Dockter takes a different approach. It leaves the installation of language packages to the language package managers: Python's [`pip`](https://pypi.org/project/pip/), Node.js's `npm`, and R's `install.packages`. These package managers are good at the job they were designed for - to check which packages need to be updated and to only update them. The result is much faster rebuilds, especially for R packages, which often involve compilation.
Dockter does this by looking for a special `# dockter` comment in a `Dockerfile`. Instead of throwing away layers, it executes all instructions after this comment in the same layer - thus reusing packages that were previously installed.
#### An example
Here's a simple motivating [example](fixtures/tests/py-pandas). It's a Python project with a `requirements.txt` file which specifies that the project depends upon `pandas` which, to ensure reproducibility, is pinned to version `0.23.0`,
```

@@ -112,3 +120,3 @@ pandas==0.23.0

The project has also got a `Dockerfile` that specifies which Python version we want, copies `requirements.txt` into the image, and installs the packages:
The project also has a `Dockerfile` which specifies which Python version we want to use, copies `requirements.txt` into the image, and uses `pip` to install the packages:

@@ -128,12 +136,11 @@ ```Dockerfile

Docker will download the base Python image (if you don't yet have it), and download five packages (`pandas` and it's four dependencies) and install them. This took over 9 minutes when we ran it.
Docker will download the base Python image (if you don't yet have it), download five packages (`pandas` and its four dependencies) and install them. This took over 9 minutes when we ran it.
Now, let's say that we want to do some plotting in our library, so we add `matplotlib` as a dependency in `requirements.txt`,
Now, let's say that we want to get the latest version of `pandas` and increment the version in the `requirements.txt` file,
```
pandas==0.23.0
matplotlib==3.0.0
pandas==0.23.1
```
When we do `docker build .` again Docker notices that the `requirements.txt` file has changed and so throws away that layer and all subsequent ones. This means that it will download and install **all** the necessary packages again, including the ones that we previously installed - and takes longer than the first install. For a more contrived illustration of this, simply add a space to a line in the `requirements.txt` file and notice how the package install gets repeated all over again.
When we do `docker build .` again to update the image, Docker notices that the `requirements.txt` file has changed and so throws away that layer and all subsequent ones. This means that it will download and install *all* the necessary packages again, including the ones that we previously installed. For a more contrived illustration of this, simply add a space to one of the lines in the `requirements.txt` file and notice how the package install gets repeated all over again.

@@ -151,3 +158,3 @@ Now, let's add a special `# dockter` comment to the Dockerfile before the `COPY` directive,

The comment is ignored by Docker but tells `dockter` to run all subsequent directives and commit them into a single layer,
The comment is ignored by Docker but tells `dockter` to run all subsequent instructions in a single filesystem layer,
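Assembled from the snippets earlier in this README, the amended `Dockerfile` would look something like this (a reconstruction for illustration, not the file verbatim):

```Dockerfile
FROM python:3.7.0

# dockter
COPY requirements.txt requirements.txt
RUN pip install -r requirements.txt
```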

@@ -158,23 +165,57 @@ ```bash

> 🔧 Finish description of commit-based approach and illustrate speed up over normal Docker builds
Now, if you change the `requirements.txt` file, instead of reinstalling everything again, `pip` will only reinstall what it needs to - the updated `pandas` version. The output looks like:
```
Step 1/1 : FROM python:3.7.0
---> a9d071760c82
Successfully built a9d071760c82
Successfully tagged dockter-5058f1af8388633f609cadb75a75dc9d:system
Dockter 1/2 : COPY requirements.txt requirements.txt
Dockter 2/2 : RUN pip install -r requirements.txt
Collecting pandas==0.23.1 (from -r requirements.txt (line 1))
<snip>
Successfully built pandas
Installing collected packages: pandas
Found existing installation: pandas 0.23.0
Uninstalling pandas-0.23.0:
Successfully uninstalled pandas-0.23.0
Successfully installed pandas-0.23.1
```
### Generates structured meta-data for your project
Dockter has been built to expose a JSON-LD API so that it works with other tools. It will parse a Dockerfile into a JSON-LD [`SoftwareSourceCode`](https://schema.org/SoftwareSourceCode) node extracting meta-data about the Dockerfile and build it into a `SoftwareEnvironment` node with links to the source files and the build image.
Dockter uses [JSON-LD](https://json-ld.org/) as its internal data structure. When it parses your project's source code it generates a JSON-LD tree using vocabularies from [schema.org](https://schema.org) and [CodeMeta](https://codemeta.github.io/index.html).
> 🔧 Illustrate how this is done for all project sources including non Dockerfiles
For example, it will parse a `Dockerfile` into a schema.org [`SoftwareSourceCode`](https://schema.org/SoftwareSourceCode) node, extracting meta-data about the Dockerfile.
> 🔧 Replace this JSON-LD with final version

```json
{
  "@context": "https://schema.stenci.la",
  "type": "SoftwareSourceCode",
  "id": "https://hub.docker.com/#sha256:27d6e441706e89dac442c69c3565fc261b9830dd313963cb5488ba418afa3d02",
  "author": [],
  "text": "FROM busybox\nLABEL description=\"Prints the current date and time at UTC, to the nearest second, in ISO-8601 format\" \\\n      author=\"Nokome Bentley <nokome@stenci.la>\"\nCMD date -u -Iseconds\n",
  "programmingLanguage": "Dockerfile",
  "messages": [],
  "description": "Prints the current date and time at UTC, to the nearest second, in ISO-8601 format"
}
```

Dockter also fetches meta-data on your project's dependencies, which could be used to generate a complete software citation for your project:

```json
{
  "name": "myproject",
  "datePublished": "2017-10-19",
  "description": "Regression analysis for my data",
  "softwareRequirements": [
    {
      "name": "car",
      "description": "\nFunctions to Accompany J. Fox and S. Weisberg,\nAn R Companion to Applied Regression, Third Edition, Sage, in press.",
      "urls": [
        "https://r-forge.r-project.org/projects/car/",
        "https://CRAN.R-project.org/package=car",
        "http://socserv.socsci.mcmaster.ca/jfox/Books/Companion/index.html"
      ],
      "authors": [
        {
          "name": "John Fox",
          "familyNames": [
            "Fox"
          ],
          "givenNames": [
            "John"
          ]
        }
      ]
    }
  ]
}
```
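As an illustration of how this structured meta-data could be consumed, the snippet below pulls dependency names and authors out of a JSON-LD document shaped like the example above. This is plain JavaScript with a hard-coded `environ` object, not a Dockter API call:

```js
// A JSON-LD software environment shaped like Dockter's output above
// (hard-coded here purely for illustration)
const environ = {
  name: 'myproject',
  softwareRequirements: [
    { name: 'car', authors: [{ name: 'John Fox' }] }
  ]
}

// Turn each requirement into a simple citation-style string
const citations = environ.softwareRequirements.map(
  req => `${req.name} by ${req.authors.map(a => a.name).join(', ')}`
)
console.log(citations) // [ 'car by John Fox' ]
```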


Dockter is designed to make it easier to get started creating Docker images for your project. But it's also designed not to get in your way or restrict you from using bare Docker. You can easily, and individually, override any of the steps that Dockter takes to build an image.
- *Code analysis*: To stop Dockter doing code analysis and take over specifying your project's package dependencies, just remove the leading '.' from the `.DESCRIPTION`, `.requirements.txt` or `.package.json` file that Dockter generates.
- *Dockerfile generation*: Dockter aims to generate readable Dockerfiles that conform to best practices. They 🦄 [#36](https://github.com/stencila/dockter/issues/36) include comments on what each section does and are a good way to start learning how to write your own Dockerfiles. To stop Dockter generating a `.Dockerfile`, and start editing it yourself, just rename it to `Dockerfile`.

- *Image build*: Dockter managed builds use a special comment in the `Dockerfile`, so you can stop using Dockter altogether and build the same image using Docker (it will just take longer if you change your project dependencies).
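As a concrete sketch of the first two override steps (the `touch` calls stand in for files Dockter would normally generate):

```shell
# Work in a scratch directory; suppose Dockter generated these files
# (created here with touch purely for illustration)
cd "$(mktemp -d)"
touch .requirements.txt .Dockerfile

# Take over Python dependency management: remove the leading dot
mv .requirements.txt requirements.txt

# Take over Dockerfile editing: rename the generated file
mv .Dockerfile Dockerfile
```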

### CLI
The command line tool has three primary commands: `compile`, `build` and `execute`. To get an overview of the commands available use the `--help` option, i.e. `dockter --help`.


### Compile a project
The `compile` command compiles a project folder into a specification of a software environment. It scans the folder for source code and package requirement files, parses them, and creates an `.environ.jsonld` file. This file contains the information needed to build a Docker image for your project.

For example, let's say your project folder has a single R file, `main.R`, which uses the R package `lubridate` to print out the current time.
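A minimal sketch of such a file (assuming the `lubridate` package is available):

```r
#!/usr/bin/env Rscript
# Print the current time using the lubridate package
library(lubridate)
print(now())
```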

Let's compile that project and inspect the compiled software environment. Change into the project directory and run the `compile` command.

```bash
dockter compile
```
You should find three new files in the folder created by Dockter:

- `.DESCRIPTION`: An R package description file containing a list of the R packages required and other meta-data
- `.environ.jsonld`: A JSON-LD document containing structured meta-data on your project and all of its dependencies
- `.Dockerfile`: A `Dockerfile` generated from `.environ.jsonld`

### Build a Docker image

Usually, you'll compile and build a Docker image for your project in one step using the `build` command. This command runs the `compile` command and builds a Docker image from the generated `.Dockerfile` (or a handwritten `Dockerfile`):
```bash
dockter build
```
### Execute a Docker image

You can use Docker to run the created image. Or use Dockter's `execute` command to compile, build and run your image in one step:

```bash
dockter execute
```
### Router and server
The [Express](https://expressjs.com) router provides `PUT /compile` and `PUT /execute` endpoints (which do the same thing as the corresponding CLI commands). You can serve them using,
```bash
npm start
```
Or, during development using,
```bash
npm run server
```
A minimal example of how to integrate the router into your own Express server,
```js
const app = require('express')()
const { docker } = require('@stencila/dockter')
app.use('/docker', docker)
app.listen(3000)
```
## Architecture
Dockter implements a compiler design pattern. Source files are _parsed_ into a `SoftwareEnvironment` instance (the equivalent of an AST (Abstract Syntax Tree) in other compilers), which is used to generate a `Dockerfile`, which in turn is built into a Docker image.
The parser classes (e.g. `PythonParser`, `RParser`) scan for relevant source files and generate `SoftwareEnvironment` instances.
The generator classes (e.g. `PythonGenerator`, `RGenerator`) generate a `Dockerfile` for a given `SoftwareEnvironment`.
`DockerGenerator` is a super-generator which combines the other generators.
`DockerBuilder` builds a Docker image from the generated `Dockerfile`.
`DockerCompiler` links all of these together.
For example, if a folder has a single file in it, `code.py`, `PythonParser` will parse that file and create a `SoftwareEnvironment` instance, which `DockerGenerator` uses to generate a `Dockerfile`, which `DockerBuilder` uses to build a Docker image.
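A hypothetical, heavily simplified TypeScript sketch of that pipeline. The real Dockter classes have much richer APIs; the regex-based `library()` scan and the generated `Dockerfile` text here are purely illustrative:

```typescript
// Minimal stand-in for Dockter's SoftwareEnvironment node
interface SoftwareEnvironment {
  name: string
  softwareRequirements: string[]
}

class RParser {
  // Scan R sources for library() calls to collect package requirements
  parse (files: { [name: string]: string }): SoftwareEnvironment {
    const pkgs = new Set<string>()
    for (const source of Object.values(files)) {
      const regex = /library\((\w+)\)/g
      let match
      while ((match = regex.exec(source)) !== null) pkgs.add(match[1])
    }
    return { name: 'project', softwareRequirements: Array.from(pkgs) }
  }
}

class DockerGenerator {
  // Emit a Dockerfile that installs each required package
  generate (environ: SoftwareEnvironment): string {
    const installs = environ.softwareRequirements
      .map(pkg => `RUN Rscript -e 'install.packages("${pkg}")'`)
      .join('\n')
    return `FROM ubuntu:18.04\n${installs}\n`
  }
}

const environ = new RParser().parse({ 'main.R': 'library(lubridate)\nnow()' })
console.log(new DockerGenerator().generate(environ))
```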
## Contribute

See [CONTRIBUTING.md](CONTRIBUTING.md) for guidelines on how to get started.

## See also

There are several other projects that create Docker images from source code and/or requirements files including:

- [`alibaba/derrick`](https://github.com/alibaba/derrick)
- [`jupyter/repo2docker`](https://github.com/jupyter/repo2docker)
- [`Gueils/whales`](https://github.com/Gueils/whales)
- [`o2r-project/containerit`](https://github.com/o2r-project/containerit)
- [`openshift/source-to-image`](https://github.com/openshift/source-to-image)
- [`ViDA-NYU/reprozip`](https://github.com/ViDA-NYU/reprozip)

Dockter is similar to `repo2docker`, `containerit`, and `reprozip` in that it is aimed at researchers doing data analysis (and supports R), whereas most other tools are aimed at software developers (and don't support R). Dockter differs from these projects principally in that it:

- performs static code analysis for multiple languages to determine package requirements (🦄 well, right now just R)
- uses package databases to determine package system dependencies and generate linked meta-data (`containerit` does this for R)
- installs language package dependencies more quickly, which can be useful during research projects when requirements often evolve
- by default, but optionally, installs Stencila packages so that Stencila client interfaces can execute code in the container

`reprozip` and its extension `reprounzip-docker` may be a better choice if you want to share your existing local environment as a Docker image with someone else.

`containerit` might suit you better if you only need support for R and don't want managed package installation.

`repo2docker` is likely to be a better choice if you want to run Jupyter notebooks or RStudio in your container and don't need source code scanning to detect your requirements.

If you don't want to build a Docker image and just want a tool that helps determine the package dependencies of your source code, check out:

- Node.js: [`detective`](https://github.com/browserify/detective)
- Python: [`modulefinder`](https://docs.python.org/3.7/library/modulefinder.html)
- R: [`hadley/requirements`](https://github.com/hadley/requirements)

## FAQ

*Why go to the effort of generating a JSON-LD intermediate representation instead of writing a Dockerfile directly?*

Having an intermediate representation of the software environment allows this data to be used for other purposes (e.g. software citations, publishing, archiving). It also allows us to reuse much of this code for build targets other than Docker (e.g. Nix) and sources other than code files (e.g. a GUI).

*Why is Dockter a Node.js package?*

We've implemented this as a Node.js package for easier integration into Stencila's Node.js based desktop and cloud deployments.

*Why is Dockter implemented in Typescript?*

We chose Typescript because its type checking and type annotations reduce the number of runtime errors and improve the developer experience.

*I'd love to help out! Where do I start?*

See [CONTRIBUTING.md](CONTRIBUTING.md) (OK, so this isn't asked *that* frequently. But it's worth a try eh :woman_shrugging:.)

## Acknowledgments

Dockter was inspired by similar tools for researchers including [`binder`](https://github.com/binder-project/binder), [`repo2docker`](https://github.com/jupyter/repo2docker) and [`containerit`](https://github.com/o2r-project/containerit). It relies on many great open source projects, in particular:

- [`crandb`](https://github.com/metacran/crandb)

- [`docker-file-parser`](https://www.npmjs.com/package/docker-file-parser)
- [`pypa`](https://warehouse.pypa.io)
- [`sysreqsdb`](https://github.com/r-hub/sysreqsdb)
- and of course, [Docker](https://www.docker.com/)

```diff
@@ -42,3 +42,3 @@ #!/usr/bin/env node
 // @ts-ignore
-.command('compile [folder] [format]', 'Compile a project to a software environment', yargs => {
+.command('compile [folder]', 'Compile a project to a software environment', yargs => {
 yargs.positional('folder', {
@@ -48,11 +48,5 @@ type: 'string',
 describe: 'The path to the project folder'
-}),
-yargs.positional('format', {
-type: 'string',
-default: 'json',
-describe: 'Format to output the environment: json or yaml'
-})
+})
 }, async (args: any) => {
-const node = await compiler.compile('file://' + args.folder, false).catch(error)
-output(node, args.format)
+await compiler.compile('file://' + args.folder, false).catch(error)
 })
@@ -74,3 +68,3 @@
 // @ts-ignore
-.command('execute [folder] [format]', 'Execute a project', yargs => {
+.command('execute [folder]', 'Execute a project', yargs => {
 yargs.positional('folder', {
@@ -77,0 +71,0 @@ type: 'string',
```

```diff
 import fs from 'fs'
 import path from 'path'
+import { SoftwareEnvironment } from '@stencila/schema'
-import { SoftwareSourceCode, SoftwareEnvironment } from './context'
 import Parser from './Parser'
@@ -7,0 +6,0 @@ import DockerParser from './DockerParser'

 import Generator from './Generator'
 import PythonGenerator from './PythonGenerator'
 import RGenerator from './RGenerator'
-import { SoftwareEnvironment } from './context'
+import { SoftwareEnvironment } from '@stencila/schema'
@@ -6,0 +6,0 @@ const PREFERRED_UBUNTU_VERSION = '18.04'
@@ -0,5 +1,5 @@
+import { SoftwareEnvironment, Person } from '@stencila/schema'
 import parser from 'docker-file-parser'
 import Parser from './Parser'
-import { ComputerLanguage, SoftwarePackage, SoftwareEnvironment, push, Person } from './context'
@@ -91,3 +91,3 @@ /**
 case 'author':
-environ.authorsPush(Person.fromText(value))
+environ.authors.push(Person.fromText(value))
 break
@@ -103,3 +103,3 @@ }
 else throw new Error(`Unexpected type of instruction arguments ${typeof instruction.args}`)
-environ.authorsPush(Person.fromText(author))
+environ.authors.push(Person.fromText(author))
 }
@@ -106,0 +106,0 @@

 import Doer from './Doer'
-import { SoftwareEnvironment, SoftwarePackage } from './context'
+import { SoftwareEnvironment, SoftwarePackage } from '@stencila/schema'
@@ -4,0 +4,0 @@ const VERSION = require('../package').version
```

```diff
 import Doer from './Doer'
-import { SoftwareEnvironment } from './context'
+import { SoftwareEnvironment } from '@stencila/schema'
@@ -4,0 +4,0 @@ /**

 import Generator from './Generator'
-import { ComputerLanguage, SoftwareEnvironment } from './context'
+import { SoftwareEnvironment } from '@stencila/schema'
@@ -46,3 +46,3 @@ const GENERATED_REQUIREMENTS_FILE = '.requirements.txt'
 return this.filterPackages('Python').map(
-requirement => requirement.identifier()
+requirement => `${requirement.name}${requirement.version}`
 ).join('\n')
@@ -49,0 +49,0 @@ }

 import Parser from './Parser'
-import { SoftwareEnvironment, SoftwarePackage, SoftwareSourceCode } from './context'
+import { SoftwareEnvironment, SoftwarePackage, SoftwareSourceCode } from '@stencila/schema'
 import { basename } from 'path'
@@ -95,3 +95,3 @@
-environ.softwareRequirementsPush(standardRequirement)
+environ.softwareRequirements.push(standardRequirement)
 } else if (rawRequirement.type === RequirementType.URL) {
@@ -98,0 +98,0 @@ let sourceRequirement = new SoftwareSourceCode()

 import Generator from './Generator'
-import { SoftwarePackage } from './context'
@@ -4,0 +3,0 @@ /**

 import path from 'path'
 import Parser from './Parser'
-import { ComputerLanguage, SoftwarePackage, SoftwareEnvironment, push, Person } from './context'
+import { SoftwarePackage, SoftwareEnvironment, Person } from '@stencila/schema'
@@ -153,7 +153,7 @@ /**
 const roles = match[2].split(', ')
-if (roles.includes('aut')) push(pkg, 'authors', person)
-if (roles.includes('ctb')) push(pkg, 'contributors', person)
-if (roles.includes('cre')) push(pkg, 'creators', person)
+if (roles.includes('aut')) pkg.authors.push(person)
+if (roles.includes('ctb')) pkg.contributors.push(person)
+if (roles.includes('cre')) pkg.creators.push(person)
 } else {
-push(pkg, 'authors', Person.fromText(author))
+pkg.authors.push(Person.fromText(author))
 }
@@ -191,3 +191,3 @@ })
 required.runtimePlatform = 'deb'
-pkg.softwareRequirementsPush(required)
+pkg.softwareRequirements.push(required)
 } else if (Array.isArray(debPackage)) {
@@ -200,3 +200,3 @@ // Handle arrays e.g. curl https://sysreqs.r-hub.io/pkg/gsl
 required.runtimePlatform = 'deb'
-pkg.softwareRequirementsPush(required)
+pkg.softwareRequirements.push(required)
 }
@@ -207,3 +207,3 @@ if (deb.runtime) {
 required.runtimePlatform = 'deb'
-pkg.softwareRequirementsPush(required)
+pkg.softwareRequirements.push(required)
 }
@@ -210,0 +210,0 @@ }
```

```diff
 import DockerCompiler from '../src/DockerCompiler'
 import fixture from './fixture'
-import { SoftwareEnvironment, Person } from '../src/context';
+import { SoftwareEnvironment, Person } from '@stencila/schema';
@@ -5,0 +5,0 @@ /**
@@ -0,4 +1,5 @@
+import { SoftwareEnvironment, SoftwarePackage } from '@stencila/schema'
 import fixture from './fixture'
 import DockerGenerator from '../src/DockerGenerator'
-import { SoftwareEnvironment, SoftwarePackage } from '../src/context';
@@ -5,0 +6,0 @@ /**

 import DockerParser from '../src/DockerParser'
-import {Person, SoftwareEnvironment} from '../src/context'
+import {Person, SoftwareEnvironment} from '@stencila/schema'
 import fixture from './fixture'
@@ -14,3 +14,3 @@
 environ = await parser.parse('FROM ubuntu') as SoftwareEnvironment
-expect(environ.authors).toBeUndefined()
+expect(environ.authors).toEqual([])
@@ -17,0 +17,0 @@ // Single label
@@ -1,2 +0,2 @@
-import { SoftwareEnvironment, SoftwareApplication } from '../src/context'
+import { SoftwareEnvironment, SoftwarePackage } from '@stencila/schema'
 import PythonGenerator from '../src/PythonGenerator'
@@ -26,3 +26,3 @@ import fs from 'fs'
-const arrowPackage = new SoftwareApplication()
+const arrowPackage = new SoftwarePackage()
 arrowPackage.name = 'arrow'
@@ -29,0 +29,0 @@ arrowPackage.version = '==0.12.1'

 import fixture from './fixture'
 import PythonParser, { RequirementType } from '../src/PythonParser'
-import { SoftwareEnvironment, SoftwarePackage } from '../src/context'
+import { SoftwareEnvironment, SoftwarePackage } from '@stencila/schema'
@@ -5,0 +5,0 @@ /**

 import fixture from './fixture'
 import RParser from '../src/RParser'
 import RGenerator from '../src/RGenerator'
-import { SoftwareEnvironment, SoftwarePackage } from '../src/context';
+import { SoftwareEnvironment, SoftwarePackage } from '@stencila/schema'
@@ -6,0 +6,0 @@ /**

 import fixture from './fixture'
 import RParser from '../src/RParser'
-import { SoftwareEnvironment } from '../src/context'
+import { SoftwareEnvironment } from '@stencila/schema'
@@ -5,0 +5,0 @@ // Increase timeout (in milliseconds) to allow for HTTP requests
```

