
gatsby-transformer-json
Parses raw JSON strings into JavaScript objects, e.g. from JSON files. Supports arrays of objects and single objects.
npm install gatsby-transformer-json
If you want to transform JSON files, you also need to have gatsby-source-filesystem installed and configured so it points to your files.
In your gatsby-config.js:
module.exports = {
  plugins: [
    `gatsby-transformer-json`,
    {
      resolve: `gatsby-source-filesystem`,
      options: {
        path: `./src/data/`,
      },
    },
  ],
}
You can choose to structure your data as arrays of objects in individual files or as single objects spread across multiple files.
The algorithm for arrays is to convert each item in the array into a node.
So if your project has a letters.json with:
[{ "value": "a" }, { "value": "b" }, { "value": "c" }]
Then the following three nodes would be created:
[{ "value": "a" }, { "value": "b" }, { "value": "c" }]
The algorithm for single JSON objects is to convert the object defined at the root of the file into a node. The type of the node is based on the name of the parent directory.
For example, let's say your project has a data layout like:
data/
  letters/
    a.json
    b.json
    c.json
Where each of a.json, b.json and c.json look like:
{ "value": "a" }
{ "value": "b" }
{ "value": "c" }
Then the following three nodes would be created:
[
  {
    "value": "a"
  },
  {
    "value": "b"
  },
  {
    "value": "c"
  }
]
Regardless of whether you choose to structure your data in arrays of objects or single objects, you'd be able to query your letters like:
{
  allLettersJson {
    edges {
      node {
        value
      }
    }
  }
}
Which would return:
{
  allLettersJson: {
    edges: [
      {
        node: {
          value: "a",
        },
      },
      {
        node: {
          value: "b",
        },
      },
      {
        node: {
          value: "c",
        },
      },
    ]
  }
}
typeName [string|function][optional]
The default naming convention documented above can be changed with either a static string value (e.g. to be able to query all JSON with a simple query):
module.exports = {
  plugins: [
    {
      resolve: `gatsby-transformer-json`,
      options: {
        typeName: `Json`, // a fixed string
      },
    },
  ],
}
{
  allJson {
    edges {
      node {
        value
      }
    }
  }
}
or a function that receives the following arguments:
node: the GraphQL node that is being processed, e.g. a File node with JSON content
object: a single object (either an item from an array or the whole JSON content)
isArray: boolean, true if object is part of an array
For example, given a file with the following contents:
[
  {
    "level": "info",
    "message": "hurray"
  },
  {
    "level": "info",
    "message": "it works"
  },
  {
    "level": "warning",
    "message": "look out"
  }
]
module.exports = {
  plugins: [
    {
      resolve: `gatsby-transformer-json`,
      options: {
        typeName: ({ node, object, isArray }) => object.level,
      },
    },
  ],
}
Each node's type is then taken from its level value, so you could query the "info" entries like:
{
  allInfo {
    edges {
      node {
        message
      }
    }
  }
}
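The entries with "level": "warning" are grouped under their own type in the same way. As a sketch, assuming the same capitalization that produces allInfo above, the matching query would be:
{
  allWarning {
    edges {
      node {
        message
      }
    }
  }
}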
The gatsbygram example site uses this plugin.
If some fields are missing or you see this error during build:
There are conflicting field types in your data. GraphQL schema will omit those fields.
It's probably because you have arrays of mixed values somewhere. For instance:
{
  "stuff": [25, "bob"],
  "orEven": [
    [25, "bob"],
    [23, "joe"]
  ]
}
If you can rewrite your data with objects, you should be good to go:
{
  "stuff": [{ "count": 25, "name": "bob" }],
  "orEven": [
    { "count": 25, "name": "bob" },
    { "count": 23, "name": "joe" }
  ]
}
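Once the shapes are consistent, every field gets a single GraphQL type and becomes queryable. As a purely illustrative sketch, if this file lived at src/data/inventory/stuff.json (so, following the parent-directory naming described above, the root object became an InventoryJson node; the path and type name here are only assumptions), the query could look like:
{
  allInventoryJson {
    edges {
      node {
        stuff {
          count
          name
        }
        orEven {
          count
          name
        }
      }
    }
  }
}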
Otherwise, if your data doesn't have a consistent schema, like TopoJSON files, or you can't rewrite it, consider placing the JSON file inside the static folder and using the dynamic import syntax (import('/static/myjson.json')) within the componentDidMount lifecycle or the useEffect hook.
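A minimal sketch of that approach, assuming a myjson.json file placed in static; the component name and rendering are only illustrative:
import React, { useEffect, useState } from "react"

export default function RawJsonViewer() {
  const [data, setData] = useState(null)

  useEffect(() => {
    // Load the file in the browser after mount, so the inconsistently
    // shaped JSON never has to pass through Gatsby's GraphQL layer.
    import("/static/myjson.json").then(mod => setData(mod.default || mod))
  }, [])

  if (!data) return <p>Loading…</p>
  return <pre>{JSON.stringify(data, null, 2)}</pre>
}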
id and jsonId key
If your data contains an id key, the transformer will automatically convert this key to jsonId, as id is a reserved internal keyword for Gatsby.
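For instance, reusing the letters layout from above with an illustrative id value, a letters/a.json containing:
{ "id": "letter-a", "value": "a" }
would expose that value as jsonId, while id stays Gatsby's internal node identifier:
{
  allLettersJson {
    edges {
      node {
        jsonId
        value
      }
    }
  }
}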