
# @meowwolf/mwap
This repo contains the code for the MeowWolf MWAPP.
This repo comes with some preconfigured development tools.

This repo requires the use of pnpm. You can install pnpm with:

```shell
npm i -g pnpm
```

To enable automatic deployments to npm, please read the Continuous Deployment section.

Then, in your terminal, install the packages by running:

```shell
pnpm install
```

It is also recommended that you install your editor's ESLint and Prettier extensions.
To build the files, you have two defined scripts:

- `pnpm dev`: Builds and creates a local server that serves all files (check Serving files on development mode for more info).
- `pnpm build`: Builds to the production directory (`dist`).

When you run `pnpm dev`, two things happen:

- The project is built in `watch` mode. Every time you save your files, the project will be rebuilt.
- A local server is created on `http://localhost:3000` that serves all your project files. You can import them in your Webflow projects like:

```html
<script defer src="http://localhost:3000/{FILE_PATH}.js"></script>
```

These settings are defined in `/bin/build.js`. You can review all the build files by updating the build settings.
In `bin/build.js`, update the `ENTRY_POINTS` array with any files you'd like to build:

```javascript
const ENTRY_POINTS = [
  'src/home/index.ts',
  'src/contact/whatever.ts',
  'src/hooyah.ts',
  'src/home/other.ts',
];
```

This will tell `esbuild` to build all those files and output them in the `dist` folder for production and on `http://localhost:3000` for development.
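As a rough sketch of that mapping (the actual esbuild options live in `bin/build.js` and may differ), an entry point under `src/` ends up at the matching path under `dist/` with a `.js` extension:

```typescript
// Illustrative only: the real build configuration lives in bin/build.js.
// Maps an entry point like 'src/home/index.ts' to the output path
// you'd expect in the dist folder, e.g. 'dist/home/index.js'.
function outputPathFor(entryPoint: string): string {
  return entryPoint.replace(/^src\//, 'dist/').replace(/\.ts$/, '.js');
}

// outputPathFor('src/home/index.ts') -> 'dist/home/index.js'
// outputPathFor('src/hooyah.ts')     -> 'dist/hooyah.js'
```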
CSS files are also supported by the bundler. When including a CSS file as an entry point, the compiler will generate a minified version in your output folder.
You can define a CSS entry point by either:

- Adding it to the `ENTRY_POINTS` array in the `bin/build.js` config (see the previous section for reference).
- Importing it from a TypeScript entry point:

```typescript
// src/index.ts
import './index.css';
```

CSS outputs are also available on `localhost` during development mode.
Path aliases are very helpful to avoid code like:

```typescript
import example from '../../../../utils/example';
```

Instead, we can create path aliases that map to a specific folder, so the code becomes cleaner:

```typescript
import example from '$utils/example';
```

You can set up path aliases using the `paths` setting in `tsconfig.json`. This template has a predefined path as an example:

```json
{
  "paths": {
    "$utils/*": ["src/utils/*"]
  }
}
```
To avoid any surprises, take some time to familiarize yourself with the tsconfig enabled flags.
In general, your development workflow should look like this: once your Pull Request is merged, if Changesets opens a follow-up PR that updates `CHANGELOG.md`, you should also merge that one. If you have automatic npm deployments enabled, Changesets will also publish the new version on npm.

If you need to work on several features before publishing a new version on npm, it is a good practice to create a `development` branch to merge all the PRs into before pushing your code to `master`.
This template contains a set of predefined scripts in the `package.json` file:

- `pnpm dev`: Builds and creates a local server that serves all files (check Serving files on development mode for more info).
- `pnpm build`: Builds to the production directory (`dist`).
- `pnpm lint`: Scans the codebase with ESLint and Prettier to see if there are any errors.
- `pnpm lint:fix`: Fixes all auto-fixable issues in ESLint.
- `pnpm check`: Checks for TypeScript errors in the codebase.
- `pnpm format`: Formats all the files in the codebase using Prettier. You probably won't need this script if you have automatic formatting on save active in your editor.
- `pnpm test`: Runs all the tests that are located in the `/tests` folder.
- `pnpm test:headed`: Runs all the tests that are located in the `/tests` folder visually in headed browsers.
- `pnpm release`: This command is defined for Changesets. You don't have to interact with it.
- `pnpm run update`: Scans the dependencies of the project and provides an interactive UI to select the ones that you want to update.

The authentication process is handled by the `MWAuth`
class. This class is responsible for handling the authentication process and storing the user's session in the browser's local storage. It's a convenient way to handle the Auth0 flow: the class wraps the `auth0-spa-js` library and provides some helpers on top of it.

An instance can also be retrieved from the `window` global object, allowing you to access it from anywhere on the Webflow side. For example, you could check if the user is authenticated by doing:

```typescript
const { mwAuth } = window;

if (mwAuth.isAuthenticated) {
  // Do something
}
```
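Since `window.mwAuth` is attached at runtime, a defensive, typed access pattern can help in TypeScript. The `MWAuthLike` interface below is an assumption for illustration: it lists only the one property this README documents, while the real `MWAuth` class exposes more.

```typescript
// Minimal assumed shape for the global instance; only isAuthenticated
// is documented here. The real MWAuth class has additional members.
interface MWAuthLike {
  isAuthenticated: boolean;
}

// Read the global defensively: the instance may not be attached yet
// when your script runs, so fall back to "not authenticated".
function isUserAuthenticated(win: { mwAuth?: MWAuthLike }): boolean {
  return win.mwAuth?.isAuthenticated ?? false;
}

// In the browser you would call: isUserAuthenticated(window as never)
```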
Below you'll find instructions on how to interact with the Wormhole APIs using the `APIController`, with some examples and utilities.

The `APIController` is a class used to interact with a GraphQL API. It uses an asynchronous approach and is designed to handle the complexity of authorization, query initialization, and error management.

```typescript
import APIController from 'path/to/APIController';

const apiController = new APIController('https://your-api-url.com');
```

The `APIController` takes a single argument during instantiation: the URL of your GraphQL API.

```typescript
const someData = await apiController.execute(queryName, variables);
```

The `execute` method is used to run a predefined GraphQL query. The method expects two parameters:

- `queryName`: A string representing the name of the query to execute. The query must have been predefined and stored in the `APIController` instance.
- `variables`: An optional object containing variables to be passed into the GraphQL query.
The `execute` method will return a Promise resolving to the data from the executed query.

```typescript
const customQuery = gql`
  query Person {
    Person {
      id
      givenName
      affiliationId
      affiliation {
        id
      }
    }
  }
`;

const someData = await apiController.query(customQuery);
```
The `query` method is used to run a GraphQL query that is not predefined. The method expects two parameters:

- `queryDocument`: A DocumentNode representing the GraphQL query to be executed. This is typically a parsed GraphQL string. For convenience, you can use the `gql` function from `graphql-tag` to parse a GraphQL string.
- `variables`: An optional object containing variables to be passed into the GraphQL query.

The `query` method will return a Promise that resolves to the data returned from the GraphQL query.
This section demonstrates how to use the utility functions to build a complex GraphQL query variable.
First, import the necessary utility functions:

```typescript
// Step 1: Import the utility functions
import { buildWhereClause, buildOrderByClause, buildLimitClause } from './utils';

// Step 2: Define the inputs
const nestStructure = ['id'];
const operation = 'equals';
const value = 1;
const field = 'name';
const direction = 'asc';
const limit = 10;

// Step 3: Use the utility functions to build the required variable object
const whereClause = buildWhereClause(nestStructure, operation, value);
const orderByClause = buildOrderByClause(field, direction);
const limitClause = buildLimitClause(limit);

/* The above would result in:
{
  where: { id: { _eq: 1 } },
  order_by: { name: 'asc' },
  limit: 10
}
*/

// Step 4: Combine the clauses
const queryArguments = {
  ...whereClause,
  ...orderByClause,
  ...limitClause,
};
```
The above variable object can then be passed into the `execute` or `query` methods as the `variables` parameter.
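To make the behavior above concrete, here is a sketch of what the three builders could look like. This is illustrative only: the real implementations live in the project's utils module and may differ internally, but the sketch reproduces the documented output shape (Hasura-style `_eq` operators are assumed from the example above).

```typescript
// Sketch of the documented builder behavior; the real implementations
// live in the project's utils module and may differ.
type Json = Record<string, unknown>;

// buildWhereClause(['id'], 'equals', 1) -> { where: { id: { _eq: 1 } } }
function buildWhereClause(nestStructure: string[], operation: string, value: unknown): Json {
  // Assumed mapping from friendly names to GraphQL comparison operators.
  const operators: Record<string, string> = { equals: '_eq' };
  let clause: Json = { [operators[operation] ?? operation]: value };
  // Wrap the comparison from the innermost field outwards.
  for (const field of [...nestStructure].reverse()) {
    clause = { [field]: clause };
  }
  return { where: clause };
}

// buildOrderByClause('name', 'asc') -> { order_by: { name: 'asc' } }
function buildOrderByClause(field: string, direction: 'asc' | 'desc'): Json {
  return { order_by: { [field]: direction } };
}

// buildLimitClause(10) -> { limit: 10 }
function buildLimitClause(limit: number): Json {
  return { limit };
}
```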
In case of GraphQL errors, both the `execute` and `query` methods will throw an Error. Make sure to handle this properly. An example would be:

```typescript
try {
  const data = await apiController.execute('getUser', { userId: 1 });
  console.log(data);
} catch (error) {
  console.error('An error occurred', error);
}
```
This class uses an access token stored in session storage for authorizing requests. It is assumed that the session storage contains an item with the key defined by `ACCESS_TOKEN` from `'$auth/utils/constants'`. The token is then added to each request as a Bearer token in the `Authorization` header.
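The header construction can be sketched as below. This is illustrative only: the real logic lives inside `APIController`, and the `ACCESS_TOKEN` value here is a stand-in for the constant exported from `'$auth/utils/constants'`.

```typescript
// Stand-in for the key exported from '$auth/utils/constants' (assumed name).
const ACCESS_TOKEN = 'access_token';

// getItem mirrors sessionStorage.getItem so the helper can be
// exercised outside a browser; in the browser you would pass
// (key) => sessionStorage.getItem(key).
function buildAuthHeaders(getItem: (key: string) => string | null): Record<string, string> {
  const token = getItem(ACCESS_TOKEN);
  // Only attach the Authorization header when a token is present.
  return token ? { Authorization: `Bearer ${token}` } : {};
}
```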
On instantiation, the `APIController` class initializes its predefined queries. The method used for this is `initializeQueries`, which loads queries using the `loadQueries` function. Any error during this process will prevent the instance from being created.
New queries need to be added to the `queries` object in the `initializeQueries` method. The `queries` object is a key-value pair where the key is the name of the query and the value is the query itself. The query must be a DocumentNode, which can be created using the `gql` function from `graphql-tag`.
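The registration pattern can be sketched as follows. Note this is a simplified, dependency-free sketch: the real `queries` object stores DocumentNodes created with `gql` from `graphql-tag`, and the `getUser` query shown here is a hypothetical example, not one the project necessarily defines.

```typescript
// Simplified sketch of the pattern described above. The real queries
// object stores DocumentNodes (via gql from graphql-tag); plain
// strings are used here so the sketch stays dependency-free.
const queries: Record<string, string> = {
  // key: query name used with execute(); value: the query document.
  getUser: `
    query GetUser($userId: Int!) {
      user(id: $userId) {
        id
        givenName
      }
    }
  `,
};

// execute() looks a query up by name before sending it to the API.
function getQuery(name: string): string {
  const query = queries[name];
  if (!query) throw new Error(`Unknown query: ${name}`);
  return query;
}
```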
This repo contains a set of helpers with proper CI/CD workflows. TODO: Add deployment instructions for the vendor of choice: Cloudflare Workers?
When you open a Pull Request, a Continuous Integration workflow will run to:

- Lint the code: uses the `pnpm lint` and `pnpm check` commands under the hood.
- Run the tests: uses the `pnpm test` command under the hood.

If any of these jobs fail, you will get a warning in your Pull Request and should try to fix your code accordingly.
Changesets allows us to generate automatic changelog updates when merging a Pull Request to the `master` branch.
Before starting, make sure to enable full compatibility with Changesets in the repository.
To generate a new changeset, run:

```shell
pnpm changeset
```
You'll be prompted with a few questions to complete the changelog.
Once the Pull Request is merged into `master`, a new Pull Request will automatically be opened by the Changesets bot that bumps the package version and updates the `CHANGELOG.md` file.
You'll have to manually merge this new PR to complete the workflow.
If an `NPM_TOKEN` secret is included in the repository secrets, Changesets will automatically deploy the new package version to npm.
See how to automatically deploy updates to npm for more info.
Some repositories may not have the required permissions to let Changesets interact with the repository.
To enable full compatibility with Changesets, go to the repository settings (`Settings > Actions > General > Workflow Permissions`) and define:

- Read and write permissions.
- Allow GitHub Actions to create and approve pull requests.

Enabling this setting for your organization account (`Account Settings > Actions > General`) could help streamline the process. By doing so, any new repos created under the org will automatically inherit the setting, which can save your teammates time and effort. This can only be applied to organization accounts at this time.
As mentioned before, Changesets will automatically deploy the new package version to npm if an `NPM_TOKEN` secret is provided.

This npm token should belong to an account with publishing rights for the package.

Once you're logged into the npm account, you can get an access token by following this guide.

The access token must then be placed in a repository secret named `NPM_TOKEN`.
In short, to publish a new version:

1. Merge your changes into the `master` branch.
2. Run `pnpm changeset` to create a new changeset.

**Deploying a pre-release version**

1. Make sure you have the `NPM_TOKEN` secret set up in your repository.
2. Run `pnpm build` to build the project.
3. Run `pnpm changeset pre enter next` and then `pnpm changeset` to create a new changeset.
4. Run `pnpm changeset version` to bump the package version.
5. Commit the changes:

   ```shell
   git add .
   git commit -m "enter pre-release and bumped version"
   ```

6. Run `pnpm changeset publish` to publish the new version to npm.

**Deploying a snapshot version**

1. Make sure you have the `NPM_TOKEN` secret set up in your repository.
2. Run `pnpm build` to build the project.
3. Run `pnpm changeset` to create a new changeset.
4. Run `pnpm changeset version --snapshot` to bump the package version, or `pnpm changeset version --snapshot <custom_name>` to add a custom snapshot name.
5. Commit the changes:

   ```shell
   git add .
   git commit -m "created snapshot version"
   ```

6. Run `pnpm changeset publish --no-git-tag --snapshot` to publish the new version to npm.

**Deploying manually**

1. Make sure you have the `NPM_TOKEN` secret set up in your repository.
2. Run `pnpm build` to build the project.
3. Run `pnpm changeset` to create a new changeset.
4. Run `pnpm changeset version` to bump the package version.
5. Commit the changes:

   ```shell
   git add .
   git commit -m "bumped version"
   ```

6. Run `pnpm changeset publish` to publish the new version to npm.