Export From JSON

Export to plain text, css, html, json, csv, xls, xml files from JSON.

Installation

yarn add export-from-json

or

npm i --save export-from-json

or

pnpm i --save export-from-json

Usage

exportFromJSON supports CommonJS, ECMAScript Module, and UMD imports.

exportFromJSON accepts the options described in the Types section and uses a front-end downloader as the default processor. In a browser environment the default processor has a content size limitation, so consider the server-side solution below for large exports.

In a module system

import exportFromJSON from 'export-from-json'

const data = [{ foo: 'foo'}, { bar: 'bar' }]
const fileName = 'download'
const exportType =  exportFromJSON.types.csv

exportFromJSON({ data, fileName, exportType })

In the browser

Check the CodePen example.

<script src="https://unpkg.com/export-from-json/dist/umd/index.min.js"></script>
<script>
    const data = [{ foo: 'foo'}, { bar: 'bar' }]
    const fileName = 'download'
    const exportType = 'csv'

    window.exportFromJSON({ data, fileName, exportType })
</script>

On a Node.js server

exportFromJSON returns whatever the processor option returns, so it can be used on the server side to provide a conversion/download service:

const http = require('http')
const exportFromJSON = require('export-from-json')

http.createServer(function (request, response){
    // exportFromJSON also accepts a JSON string as the data option; it is common to read it directly from the HTTP request.
    const data = '[{"foo":"foo"},{"bar":"bar"}]'
    const fileName = 'download'
    const exportType = 'txt'

    const result = exportFromJSON({
        data,
        fileName,
        exportType,
        processor (content, type, fileName) {
            switch (type) {
                case 'txt':
                    response.setHeader('Content-Type', 'text/plain')
                    break
                case 'css':
                    response.setHeader('Content-Type', 'text/css')
                    break
                case 'html':
                    response.setHeader('Content-Type', 'text/html')
                    break
                case 'json':
                    response.setHeader('Content-Type', 'text/plain')
                    break
                case 'csv':
                    response.setHeader('Content-Type', 'text/csv')
                    break
                case 'xls':
                    response.setHeader('Content-Type', 'application/vnd.ms-excel')
                    break
            }
            response.setHeader('Content-disposition', 'attachment;filename=' + fileName)
            return content
        }
    })

    response.write(result)
    response.end()
}).listen(8080, '127.0.0.1')
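
The result does not have to be streamed over HTTP. As a minimal sketch built on the same documented options, a custom processor that simply returns the content can be paired with Node's fs module to write the converted output to disk (the 'download.json' path is illustrative):

const fs = require('fs')
const exportFromJSON = require('export-from-json')

// The processor just hands the converted content back, so exportFromJSON
// returns the JSON string instead of triggering a browser download.
const output = exportFromJSON({
    data: [{ foo: 'foo' }, { bar: 'bar' }],
    fileName: 'download',
    exportType: 'json',
    processor: content => content
})

fs.writeFileSync('download.json', output)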

Types

Note: JSON refers to a parsable JSON string or a serializable JavaScript object.

| Option name | Required | Type | Description |
| --- | --- | --- | --- |
| data | true | Array<JSON>, JSON or string | If exportType is 'json', data can be any parsable JSON. If exportType is 'csv' or 'xls', data can only be an array of parsable JSON. If exportType is 'txt', 'css' or 'html', data must be a string. |
| fileName | false | string | Filename without extension; defaults to 'download'. |
| extension | false | string | Filename extension; defaults to the exportType. |
| fileNameFormatter | false | (name: string) => string | Filename formatter; by default the filename is formatted to snake case. |
| fields | false | string[] or a field-name mapper Record<string, string> | Fields filter; also supports renaming fields by passing a name mapper, e.g. { 'bar': 'baz' }. Defaults to undefined. |
| exportType | false | Enum ExportType | 'txt' (default), 'css', 'html', 'json', 'csv', 'xls', 'xml'. |
| processor | false | (content: string, type: ExportType, fileName: string) => any | Defaults to a front-end downloader. |
| withBOM | false | boolean | Add a BOM (byte order mark) to the CSV file. A BOM is expected by Excel when reading a UTF-8 CSV file. Defaults to false. |
| beforeTableEncode | false | (entries: { fieldName: string, fieldValues: string[] }[]) => { fieldName: string, fieldValues: string[] }[] | Provides a chance to alter the table entries; only applies to CSV and XLS files. By default, no altering. |
| delimiter | false | ',' or ';' | The delimiter between values in the raw CSV data. Defaults to ','. |
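
As an illustrative sketch combining several of these options (the jsonData rows are made up for the example), a CSV export can rename its columns, prepend a BOM and switch the delimiter:

import exportFromJSON from 'export-from-json'

const jsonData = [
    { bar: '1', baz: 'a' },
    { bar: '2', baz: 'b' }
]

exportFromJSON({
    data: jsonData,
    fileName: 'report',
    exportType: exportFromJSON.types.csv,
    fields: { bar: 'Bar', baz: 'Baz' }, // keep only these fields and rename the CSV headers
    withBOM: true,                      // prepend a BOM so Excel reads the UTF-8 CSV correctly
    delimiter: ';'                      // use ';' instead of the default ','
})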

Tips

  • You can reference the export types through the static field types, e.g.
exportFromJSON({ data: jsonData, fileName: 'data', exportType: exportFromJSON.types.csv })
  • You can transform the data before exporting via beforeTableEncode, e.g.
exportFromJSON({
    data: jsonData,
    fileName: 'data',
    exportType: exportFromJSON.types.csv,
    beforeTableEncode: rows => rows.sort((p, c) => p.fieldName.localeCompare(c.fieldName)),
})
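  • You can also shape the output filename with extension and fileNameFormatter (an illustrative sketch based on the option descriptions above), e.g.
exportFromJSON({
    data: jsonData,
    fileName: 'My Data',
    extension: 'csv',
    exportType: exportFromJSON.types.csv,
    fileNameFormatter: name => name.replace(/\s+/g, '-').toLowerCase(), // formats 'My Data' to 'my-data', saved as 'my-data.csv'
})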
