Amplify Category 📡 Data Importer

The easiest way to import CSV files into DynamoDB.

View Demo · Report Bug · Request Feature


WARNING: This plugin is in alpha and may undergo backwards-incompatible changes.

Table of Contents

About The Project

Amplify is great at replicating environments, but a database without data is a lonely place.

This project aims to automate the process of seeding/importing for Amplify projects.

Check out Installation to set up an S3 bucket that streams data to your DynamoDB table.

Built With

Getting Started

To add this plugin to your Amplify project, follow these simple steps.

Prerequisites

Installation

  • Install the plugin from npm
npm install -g amplify-category-data-importer
  • Add the plugin to your project
amplify plugin add amplify-category-data-importer

Usage

Adding the resources

Add the data import resources to your amplify backend directory with:

amplify data-importer add
amplify push

Uploading CSV to DynamoDB

📃 Get a CSV file

A common use case is to export data from DynamoDB using the AWS Console, make some edits, and re-import it.

📝 Rename it

Change the name of the CSV file so it looks something like this:

Users-gkcm6todfzh5tlpgntm3lyrrgu-dev.csv

The file name (without the .csv extension) must exactly match the name of the DynamoDB table you're targeting for upload.
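The naming convention can be sketched as follows, assuming the import Lambda derives the target table name by stripping the .csv extension (the helper below is illustrative, not part of the plugin):

```python
from pathlib import Path

def target_table_name(csv_filename: str) -> str:
    # Hypothetical helper: the import targets the table whose name
    # equals the uploaded file's name without the .csv extension.
    return Path(csv_filename).stem

print(target_table_name("Users-gkcm6todfzh5tlpgntm3lyrrgu-dev.csv"))
```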

🗑️ Drop it in the bucket

Upload the renamed file to the S3 bucket created by the plugin, and the import runs automatically.

Done! 🎉 Your DynamoDB table is now seeded with data.

Note: other data types

By default, the importer uploads every value as a string.

If your table uses other types, such as numbers, edit the import Lambda in the AWS Console.

Here's an example function that casts values to the appropriate type before writing.

import boto3

dynamodb = boto3.resource('dynamodb')

def write_row_to_dynamo(tableName, row):
    table = dynamodb.Table(tableName)
    try:
        with table.batch_writer() as batch:
            batch.put_item(Item={
                'id': row['id'],
                '__typename': row['__typename'],
                'updatedAt': row['updatedAt'],
                'createdAt': row['createdAt'],
                # CSV fields arrive as strings; cast numeric columns explicitly
                'count': int(row['count']),
                'total': int(row['total']),
            })
    except dynamodb.meta.client.exceptions.ResourceNotFoundException:
        # dynamodb.Table() doesn't validate the name; a missing table
        # only surfaces here, on the first write
        print("Couldn't find DynamoDB table. Make sure the uploaded file name matches the table name.")
    except Exception as e:
        print(e)
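To see why the casts matter: Python's csv module returns every field as a string, so numeric columns must be converted before they reach DynamoDB. A quick local sketch (the sample data is made up):

```python
import csv
import io

# Hypothetical CSV content, as the Lambda would read it from S3
csv_text = "id,count,total\nabc123,3,10\n"
row = next(csv.DictReader(io.StringIO(csv_text)))

# csv.DictReader yields strings for every column, numeric or not
print(type(row['count']))
print(int(row['count']) + int(row['total']))
```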

Roadmap

The short-term goal is to reduce the number of manual steps required for a CSV import workflow.

See the GitHub Project Roadmap for a list of proposed improvements.

Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

  • Fork the Project
  • Create your Feature Branch (git checkout -b feature/AmazingFeature)
  • Commit your Changes (git commit -m 'Add some AmazingFeature')
  • Push to the Branch (git push origin feature/AmazingFeature)
  • Open a Pull Request

License

Distributed under the ISC License. See LICENSE for more information.

Contact

Twitter - @lordrozar

Acknowledgements


Package last updated on 15 Dec 2020
